
Unit 4

1. Random Process
2. Markov Chain
3. Poisson Process

Chapter 1

Random Process

1.1

Classification of random processes

Consider a random experiment with outcomes s ∈ S, where S is the sample space.
If to every outcome s ∈ S we assign a real-valued time function X(t, s), the collection of such time functions is called a random process or stochastic process.

Definition 1.1.1 A random process is a collection of random variables {X(t, s)} that are functions of a real variable, namely time t, where s ∈ S and t ∈ T (S: sample space, T: parameter set or index set).

Types of random process


Definition 1.1.2 (Discrete random sequence) If both T and S are discrete,
the random process is called a discrete random sequence.

Example 1.1.3 If X_n represents the outcome of the nth toss of a fair die, then {X_n, n ≥ 1} is a discrete random sequence, since T = {1, 2, 3, . . .} and S = {1, 2, 3, 4, 5, 6}. If X(t) = the number of defective items found at trial t = 1, 2, 3, . . ., then {X(t)} is a discrete random sequence.

Definition 1.1.4 (Discrete random process) If T is continuous and S is


discrete, the random process is called a discrete random process.

Example 1.1.5 If X(t) represents the number of telephone calls received in the interval (0, t), then {X(t)} is a discrete random process, since S = {0, 1, 2, . . .}. Also, if X(t) = the number of defective items found at time t ≥ 0, then {X(t)} is a discrete random process.

Definition 1.1.6 (Continuous random sequence) If T is discrete and S is


continuous, the random process is called a continuous random sequence.

Example 1.1.7 If X(t) = amount of rainfall measured at trial, t = 1, 2, 3, . . . ,


then X(t) is a continuous random sequence.

Example 1.1.8 If X_n represents the temperature at the end of the nth hour of a day, then {X_n, 1 ≤ n ≤ 24} is a continuous random sequence.
Definition 1.1.9 (Continuous random process) If both T and S are continuous, the random process is called a continuous random process.

Example 1.1.10 If X(t) represents the maximum temperature at a place in the


interval (0, t) , {X(t)} is a continuous random process.

Example 1.1.11 If X(t) = amount of rainfall measured at time t, t ≥ 0, then {X(t)} is a continuous random process.

Statistical Averages or Ensemble Averages

Let {X(t)} be a random process.

1. The mean of {X(t)} is defined by
$E[X(t)] = \mu_X(t) = \int_{-\infty}^{\infty} x\, f_X(x, t)\, dx$,
where X(t) is treated as a random variable for a fixed value of t.

2. The auto-correlation function (A.C.F.) of {X(t)} is defined by
$R_{XX}(t_1, t_2) = E[X(t_1)X(t_2)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_1 x_2\, f_X(x_1, x_2; t_1, t_2)\, dx_1\, dx_2$.

3. The auto-covariance of {X(t)} is defined by
$C_{XX}(t_1, t_2) = E[\{X(t_1) - \mu_X(t_1)\}\{X(t_2) - \mu_X(t_2)\}] = R_{XX}(t_1, t_2) - \mu_X(t_1)\mu_X(t_2)$,
where $\mu_X(t_1) = E[X(t_1)]$ and $\mu_X(t_2) = E[X(t_2)]$.
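Since these are ensemble quantities, they can be approximated numerically by averaging over many independently generated sample functions. The following sketch (an illustration, not part of the original text) estimates the mean and A.C.F. of the process X(t) = cos(t + Θ), Θ ~ U(0, 2π), which is used here only as a convenient example.

```python
# Estimating ensemble averages by Monte Carlo over many sample functions.
import numpy as np

rng = np.random.default_rng(0)
n_paths = 100_000
t1, t2 = 1.0, 2.5

theta = rng.uniform(0.0, 2.0 * np.pi, size=n_paths)   # one random phase per sample function
x_t1 = np.cos(t1 + theta)                              # X(t1) across the ensemble
x_t2 = np.cos(t2 + theta)                              # X(t2) across the ensemble

mean_t1 = x_t1.mean()                                  # estimate of mu_X(t1)
acf = (x_t1 * x_t2).mean()                             # estimate of R_XX(t1, t2)
acov = acf - x_t1.mean() * x_t2.mean()                 # estimate of C_XX(t1, t2)

print(mean_t1)                     # close to 0
print(acf, 0.5 * np.cos(t2 - t1))  # close to 0.5*cos(t2 - t1)
```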

Definition 1.1.12 (Stationary process) If certain probability distributions or averages do not depend on t, then the random process {X(t)} is called stationary.

1.2

Types of stationarity

Definition 1.2.1 (Strict Sense Stationary (SSS)) A random process is called strict sense stationary if all its finite-dimensional distributions are invariant under translation of the time parameter:
$f_X(x_1, x_2, \ldots, x_n; t_1, t_2, \ldots, t_n) = f_X(x_1, x_2, \ldots, x_n; t_1+\varepsilon, t_2+\varepsilon, \ldots, t_n+\varepsilon)$
for any $t_1, \ldots, t_n$ and any real number $\varepsilon$.

First order stationary


Definition 1.2.2 A random process {X(t)} is said to be a first-order stationary process if f(x, t_1 + c) = f(x, t_1) for every c. That is, the first-order density of a stationary process {X(t)} is independent of time t. Thus E[X(t)] = \mu, a constant, in a first-order stationary random process.

Second order stationary


Definition 1.2.3 A random process {X(t)} is said to be a second-order stationary process if f(x_1, x_2; t_1, t_2) = f(x_1, x_2; t_1 + c, t_2 + c) for any c. That is, the second-order density must be invariant under translation of time.

Wide-Sense Stationary (or) Weakly Stationary (or) Co-variance Stationary

Definition 1.2.4 A random process {X(t)} is called Wide-Sense Stationary (WSS) if its mean is a constant and its auto-correlation depends only on the time difference. That is, E[X(t)] = a constant and $E[X(t)X(t + \tau)] = R_{XX}(\tau)$ depends only on $\tau$. It follows that the auto-covariance of a WSS process also depends only on the time difference $\tau$; thus $C_{XX}(t, t + \tau) = R_{XX}(\tau) - \mu_X^2$.

Definition 1.2.5 (Jointly WSS processes) Two processes {X(t)} and {Y(t)} are called jointly WSS if each is WSS and their cross-correlation depends only on the time difference $\tau$. That is, $R_{XY}(t, t + \tau) = E[X(t)Y(t + \tau)] = R_{XY}(\tau)$. It follows that the cross-covariance of jointly WSS processes {X(t)} and {Y(t)} also depends only on the time difference $\tau$; thus $C_{XY}(t, t + \tau) = R_{XY}(\tau) - \mu_X\mu_Y$.

Remark 1.2.6 For a two-dimensional random process {X(t), Y(t); t ≥ 0}, we define the following:

1. Mean = $E[X(t)] = \mu_X(t)$

2. Auto-correlation = $E[X(t_1)X(t_2)]$

3. Cross-correlation = $E[X(t_1)Y(t_2)]$

4. Auto-covariance = $E[\{X(t_1) - \mu_X(t_1)\}\{X(t_2) - \mu_X(t_2)\}]$

5. Cross-covariance = $E[\{X(t_1) - \mu_X(t_1)\}\{Y(t_2) - \mu_Y(t_2)\}]$

Definition 1.2.7 A random process that is not stationary in any sense is called
an evolutionary random process.

1.3

Ergodic random process

Time averages of a random process

1. The time average of a sample function X(t) of a random process {X(t)} is defined as
$\bar{X}_T = \lim_{T \to \infty} \dfrac{1}{2T} \int_{-T}^{T} X(t)\, dt$.

2. The time-averaged auto-correlation of the random process {X(t)} is defined by
$Z_T = \overline{R}_{XX}(\tau) = \lim_{T \to \infty} \dfrac{1}{2T} \int_{-T}^{T} X(t)X(t + \tau)\, dt$.

Definition 1.3.1 (Ergodic random process) A random process {X(t)} is said


to be ergodic random process if its ensemble averages are equal to appropriate time
averages.

Definition 1.3.2 (Mean-Ergodic process) A random process {X(t)} is said to be mean ergodic if its ensemble mean is equal to its time-averaged mean. If E[X(t)] = \mu and $\bar{X}_T = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} X(t)\, dt$, then \mu = \bar{X}_T with probability one.

Definition 1.3.3 (Correlation Ergodic) A random process {X(t)} is said to be correlation ergodic if its ensemble A.C.F. is equal to its time-averaged A.C.F.

Definition 1.3.4 (Cross-Correlation Function) For a two-dimensional random process {X(t), Y(t); t ≥ 0}, the cross-correlation function $R_{XY}(\tau)$ is defined by $R_{XY}(t, t + \tau) = E[X(t)Y(t + \tau)]$.

Properties of cross-correlation

(i) $R_{YX}(\tau) = R_{XY}(-\tau)$.

(ii) $|R_{XY}(\tau)| \le \sqrt{R_{XX}(0)\, R_{YY}(0)}$.

(iii) $|R_{XY}(\tau)| \le \tfrac{1}{2}\{R_{XX}(0) + R_{YY}(0)\}$.

(iv) The processes {X(t)} and {Y(t)} are orthogonal if $R_{XY}(\tau) = 0$.

(v) If the processes {X(t)} and {Y(t)} are independent, then $R_{XY}(\tau) = \mu_X\mu_Y$.

1.4

Examples of Stationary Processes

Example 1.4.1 Examine whether the Poisson process {X(t)} given by the probability law
$P\{X(t) = r\} = \dfrac{e^{-\lambda t}(\lambda t)^r}{r!}, \qquad r = 0, 1, 2, \ldots$
is covariance stationary.

Solution: The Poisson process has the same probability distribution as a Poisson distribution with parameter $\lambda t$. Then $E[X(t)] = \lambda t$ and $E[X^2(t)] = \lambda^2 t^2 + \lambda t$. Both are functions of t. Since E[X(t)] is not constant, the Poisson process is not covariance stationary.

Example 1.4.2 Show that the random process $X(t) = A\sin(\omega t + \theta)$, where A and $\omega$ are constants and $\theta$ is a random variable uniformly distributed in $(0, 2\pi)$, is first-order stationary. Also find the auto-correlation function of the process.

Solution: Given $X(t) = A\sin(\omega t + \theta)$.

Claim: E[X(t)] is constant.

$E[X(t)] = \int X(t)\, f(\theta)\, d\theta \qquad (1)$

Since $\theta$ is uniformly distributed in $(0, 2\pi)$,
$f(\theta) = \dfrac{1}{2\pi}, \quad 0 < \theta < 2\pi; \qquad 0 \text{ otherwise.} \qquad (2)$

Substituting (2) in (1),
$E[X(t)] = \dfrac{A}{2\pi}\int_0^{2\pi}\sin(\omega t + \theta)\, d\theta = \dfrac{A}{2\pi}\big[-\cos(\omega t + \theta)\big]_0^{2\pi} = \dfrac{A}{2\pi}\big[-\cos(\omega t + 2\pi) + \cos\omega t\big] = 0,$ a constant. $\qquad (3)$

Hence {X(t)} is a first-order stationary process.

Auto-correlation function: By definition,
$R_{XX}(t, t + \tau) = E[X(t)X(t + \tau)] = E\big[A\sin(\omega t + \theta)\, A\sin(\omega t + \omega\tau + \theta)\big] \qquad (4)$
$= \dfrac{A^2}{2} E\big[\cos\omega\tau - \cos(2\omega t + \omega\tau + 2\theta)\big] = \dfrac{A^2}{2}\cos\omega\tau - \dfrac{A^2}{2} E\big[\cos(2\omega t + \omega\tau + 2\theta)\big] \qquad (5)$

Applying the density (2) to the second term of (5),
$E\big[\cos(2\omega t + \omega\tau + 2\theta)\big] = \dfrac{1}{2\pi}\int_0^{2\pi}\cos(2\omega t + \omega\tau + 2\theta)\, d\theta = \dfrac{1}{4\pi}\big[\sin(2\omega t + \omega\tau + 2\theta)\big]_0^{2\pi} = 0.$

Therefore $R_{XX}(t, t + \tau) = \dfrac{A^2}{2}\cos\omega\tau$.

Hence the auto-correlation function is $\dfrac{A^2}{2}\cos\omega\tau$.
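The two results above are easy to check numerically. The following sketch (an illustration only, with arbitrary values of A and $\omega$) averages over many draws of $\theta$ and compares against $(A^2/2)\cos\omega\tau$.

```python
# Monte Carlo check of the mean and ACF of X(t) = A*sin(w*t + theta).
import numpy as np

rng = np.random.default_rng(1)
A, w = 2.0, 3.0              # arbitrary constants for the demonstration
t, tau = 0.7, 0.4
theta = rng.uniform(0.0, 2.0 * np.pi, size=500_000)

x_t = A * np.sin(w * t + theta)
x_t_tau = A * np.sin(w * (t + tau) + theta)

print(x_t.mean())                         # ~ 0 (first-order stationary mean)
print((x_t * x_t_tau).mean())             # ~ (A**2 / 2) * cos(w * tau)
print(0.5 * A**2 * np.cos(w * tau))
```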

Example 1.4.3 Given a random variable Y with characteristic function $\phi(\omega) = E[e^{i\omega Y}] = E[\cos\omega Y + i\sin\omega Y]$, and a random process defined by $X(t) = \cos(\lambda t + Y)$, show that the random process {X(t)} is stationary in the wide sense if $\phi(1) = \phi(2) = 0$.

Solution:
$E[X(t)] = E[\cos(\lambda t + Y)] = E[\cos\lambda t\cos Y - \sin\lambda t\sin Y] = \cos\lambda t\, E[\cos Y] - \sin\lambda t\, E[\sin Y] \qquad (1)$

Since $\phi(1) = 0$, $E[\cos Y + i\sin Y] = 0$, and therefore $E[\cos Y] = E[\sin Y] = 0. \qquad (2)$

Using (2) in (1), we get $E[X(t)] = 0. \qquad (3)$

$E[X(t_1)X(t_2)] = E[\cos(\lambda t_1 + Y)\cos(\lambda t_2 + Y)]$
$= E[(\cos\lambda t_1\cos Y - \sin\lambda t_1\sin Y)(\cos\lambda t_2\cos Y - \sin\lambda t_2\sin Y)]$
$= \cos\lambda t_1\cos\lambda t_2\, E\!\left[\tfrac{1 + \cos 2Y}{2}\right] + \sin\lambda t_1\sin\lambda t_2\, E\!\left[\tfrac{1 - \cos 2Y}{2}\right] - \tfrac{1}{2}\sin\lambda(t_1 + t_2)\, E[\sin 2Y] \qquad (4)$

Since $\phi(2) = 0$, $E[\cos 2Y + i\sin 2Y] = 0$, so $E[\cos 2Y] = 0 = E[\sin 2Y]. \qquad (5)$

Using (5) in (4), we get
$E[X(t_1)X(t_2)] = R(t_1, t_2) = \tfrac{1}{2}\cos\lambda t_1\cos\lambda t_2 + \tfrac{1}{2}\sin\lambda t_1\sin\lambda t_2 = \tfrac{1}{2}\cos\lambda(t_1 - t_2). \qquad (6)$

From (3) and (6), the mean is a constant and the A.C.F. is a function of $\tau = t_1 - t_2$ only. Hence {X(t)} is a WSS process.

Example 1.4.4 Two random processes {X(t)} and {Y(t)} are defined by $X(t) = A\cos\omega t + B\sin\omega t$ and $Y(t) = B\cos\omega t - A\sin\omega t$. Show that {X(t)} and {Y(t)} are jointly WSS if A and B are uncorrelated random variables with zero means and the same variances, and $\omega$ is a constant.

Solution: Given E(A) = E(B) = 0 and Var(A) = Var(B), so $E(A^2) = E(B^2) = k$ (say). Since A and B are uncorrelated, E(AB) = 0.

First we prove that X(t) and Y(t) are individually WSS processes.

$E[X(t)] = \cos\omega t\, E(A) + \sin\omega t\, E(B) = 0,$ a constant.

$R_{XX}(t_1, t_2) = E[(A\cos\omega t_1 + B\sin\omega t_1)(A\cos\omega t_2 + B\sin\omega t_2)]$
$= E(A^2)\cos\omega t_1\cos\omega t_2 + E(B^2)\sin\omega t_1\sin\omega t_2$, since E(AB) = 0,
$= k[\cos\omega t_1\cos\omega t_2 + \sin\omega t_1\sin\omega t_2] = k\cos\omega(t_1 - t_2)$, a function of $\tau = t_1 - t_2$.

Hence {X(t)} is a WSS process. Similarly E[Y(t)] = 0 and $R_{YY}(t_1, t_2) = k\cos\omega(t_1 - t_2)$, so {Y(t)} is also a WSS process.

Cross-correlation:
$R_{XY}(t_1, t_2) = E[(A\cos\omega t_1 + B\sin\omega t_1)(B\cos\omega t_2 - A\sin\omega t_2)]$
$= E(B^2)\sin\omega t_1\cos\omega t_2 - E(A^2)\cos\omega t_1\sin\omega t_2 = k\sin\omega(t_1 - t_2).$

Therefore $R_{XY}(t_1, t_2)$ is a function of $\tau = t_1 - t_2$ only, and so {X(t)} and {Y(t)} are jointly WSS processes.

Example 1.4.5 If {X(t)} is a WSS process with auto-correlation $R(\tau) = A e^{-\alpha|\tau|}$, determine the second-order moment of the random variable X(8) - X(5).

Solution: The second moment of X(8) - X(5) is given by
$E[\{X(8) - X(5)\}^2] = E[X^2(8)] + E[X^2(5)] - 2E[X(8)X(5)] \qquad (1)$

Given $R(\tau) = A e^{-\alpha|\tau|}$, so $R(t_1, t_2) = A e^{-\alpha|t_1 - t_2|}$.

$E[X^2(t)] = R(t, t) = A$, hence $E[X^2(8)] = E[X^2(5)] = A \qquad (2)$
$E[X(8)X(5)] = R(8, 5) = A e^{-3\alpha} \qquad (3)$

Using (2) and (3) in (1), $E[\{X(8) - X(5)\}^2] = 2A - 2A e^{-3\alpha} = 2A(1 - e^{-3\alpha})$.

Example 1.4.6 Show that the random process $X(t) = A\cos(\omega t + \theta)$ is a WSS process if A and $\omega$ are constants and $\theta$ is a uniformly distributed random variable in $(0, 2\pi)$.

Proof. The p.d.f. of the uniform distribution is $f(\theta) = \dfrac{1}{2\pi}, \ 0 \le \theta \le 2\pi$.

We first show that the mean is a constant:
$E[X(t)] = \dfrac{1}{2\pi}\int_0^{2\pi} A\cos(\omega t + \theta)\, d\theta = \dfrac{A}{2\pi}\big[\sin(\omega t + \theta)\big]_0^{2\pi} = \dfrac{A}{2\pi}\big[\sin\omega t - \sin\omega t\big] = 0,$ a constant.

Now
$R_{XX}(t, t + \tau) = E[X(t)X(t + \tau)] = \dfrac{1}{2}E[2A^2\cos(\omega t + \theta)\cos(\omega t + \omega\tau + \theta)]$
$= \dfrac{A^2}{2} E[\cos\omega\tau + \cos(2\omega t + \omega\tau + 2\theta)]$
$= \dfrac{A^2}{2}\cos\omega\tau + \dfrac{A^2}{4\pi}\int_0^{2\pi}\cos(2\omega t + \omega\tau + 2\theta)\, d\theta$
$= \dfrac{A^2}{2}\cos\omega\tau + \dfrac{A^2}{8\pi}\big[\sin(2\omega t + \omega\tau + 2\theta)\big]_0^{2\pi} = \dfrac{A^2}{2}\cos\omega\tau,$ a function of $\tau$ only.

Therefore the mean is a constant and the auto-correlation function depends only on $\tau$, so {X(t)} is a WSS process.

Example 1.4.7 Consider a random process {X(t)} defined by $X(t) = Y\cos(\omega t + \theta)$, where Y and $\theta$ are independent random variables uniformly distributed over (-A, A) and $(-\pi, \pi)$ respectively.
a) Find E[X(t)].
b) Find the auto-correlation function $R_{XX}(t, t + \tau)$ of X(t).
c) Is the process X(t) WSS?

Solution:
(a) The p.d.f. of Y is $f(y) = \dfrac{1}{2A}$, $-A < y < A$, and the p.d.f. of $\theta$ is $f(\theta) = \dfrac{1}{2\pi}$, $-\pi < \theta < \pi$.

Since Y and $\theta$ are independent, $E[X(t)] = E[Y]\, E[\cos(\omega t + \theta)]$.

Now $E[Y] = \int_{-A}^{A} y\,\dfrac{1}{2A}\, dy = \dfrac{1}{2A}\left[\dfrac{y^2}{2}\right]_{-A}^{A} = \dfrac{1}{4A}\big(A^2 - A^2\big) = 0.$

Therefore E[X(t)] = 0, a constant.

(b) $R_{XX}(t, t + \tau) = E[Y\cos(\omega t + \theta)\, Y\cos(\omega t + \omega\tau + \theta)] = E[Y^2]\, E[\cos(\omega t + \theta)\cos(\omega t + \omega\tau + \theta)].$

Since $Var(Y) = \sigma^2 = E[Y^2] - (E[Y])^2$, we have $E[Y^2] = \sigma^2$ (which equals $A^2/3$ for Y uniform on (-A, A)). Therefore
$R_{XX}(t, t + \tau) = \dfrac{\sigma^2}{2} E[\cos\omega\tau + \cos(2\omega t + \omega\tau + 2\theta)]$
$= \dfrac{\sigma^2}{2}\cos\omega\tau + \dfrac{\sigma^2}{4\pi}\int_{-\pi}^{\pi}\cos(2\omega t + \omega\tau + 2\theta)\, d\theta = \dfrac{\sigma^2}{2}\cos\omega\tau + 0.$

Hence $R_{XX}(\tau) = \dfrac{\sigma^2}{2}\cos\omega\tau$, a function of $\tau$ only.

(c) Yes, X(t) is a WSS process, because the mean is constant and $R_{XX}(\tau)$ is a function of $\tau$ only.

Example 1.4.8 The probability distribution of the process {X(t)} is given by
$P(X(t) = n) = \begin{cases} \dfrac{(at)^{n-1}}{(1+at)^{n+1}}, & n = 1, 2, 3, \ldots \\[2mm] \dfrac{at}{1+at}, & n = 0. \end{cases}$
Show that it is not stationary.

Solution:
$E[X(t)] = \sum_{n=0}^{\infty} n\, P(n) = 0\cdot\dfrac{at}{1+at} + \sum_{n=1}^{\infty} n\,\dfrac{(at)^{n-1}}{(1+at)^{n+1}}$
$= \dfrac{1}{(1+at)^2}\left[1 + 2u + 3u^2 + \ldots\right], \text{ where } u = \dfrac{at}{1+at}$
$= \dfrac{1}{(1+at)^2}(1 - u)^{-2} = \dfrac{1}{(1+at)^2}\left(1 - \dfrac{at}{1+at}\right)^{-2} = 1.$

Therefore E[X(t)] = 1, a constant.

$E[X^2(t)] = \sum_{n=0}^{\infty} n^2 P(n) = \sum_{n=1}^{\infty}\{n(n+1) - n\}\,\dfrac{(at)^{n-1}}{(1+at)^{n+1}}$
$= \dfrac{2}{(1+at)^2}\left(1 - \dfrac{at}{1+at}\right)^{-3} - \dfrac{1}{(1+at)^2}\left(1 - \dfrac{at}{1+at}\right)^{-2}, \text{ since } (1-x)^{-3} = \sum_{n=1}^{\infty}\dfrac{n(n+1)}{2}x^{n-1}$
$= 2(1+at) - 1 = 1 + 2at.$

Therefore $Var(X(t)) = E[X^2(t)] - (E[X(t)])^2 = 2at$, a function of t, and so {X(t)} is not stationary.

Example 1.4.9 Assume that a random process {X(t)} has four sample functions $X(t, s_1) = \cos t$, $X(t, s_2) = -\cos t$, $X(t, s_3) = \sin t$, $X(t, s_4) = -\sin t$, all of which are equally likely. Show that it is a WSS process.

Solution:
$E[X(t)] = \dfrac{1}{4}\sum_{i=1}^{4} X(t, s_i) = \dfrac{1}{4}[\cos t - \cos t + \sin t - \sin t] = 0.$

A.C.F.:
$R_{XX}(t_1, t_2) = E[X(t_1)X(t_2)] = \dfrac{1}{4}\sum_{i=1}^{4} X(t_1, s_i)\, X(t_2, s_i)$
$= \dfrac{1}{4}\big[\cos t_1\cos t_2 + \cos t_1\cos t_2 + \sin t_1\sin t_2 + \sin t_1\sin t_2\big]$
$= \dfrac{1}{2}\big[\cos t_1\cos t_2 + \sin t_1\sin t_2\big] = \dfrac{1}{2}\cos(t_2 - t_1) = \dfrac{\cos\tau}{2}, \text{ where } \tau = t_2 - t_1.$

Since E[X(t)] is constant and $R_{XX}(t_1, t_2)$ is a function of $t_2 - t_1$ only, the process {X(t)} is WSS.

Example 1.4.10 Consider a random process X(t) = P + Qt, where P and Q are independent random variables with E(P) = p, E(Q) = q, $Var(P) = \sigma_1^2$ and $Var(Q) = \sigma_2^2$. Find E[X(t)], $R(t_1, t_2)$ and $C(t_1, t_2)$. Is {X(t)} stationary?

Solution: Given the random process X(t) = P + Qt. Then

$E[X(t)] = E(P) + t\,E(Q) = p + qt.$

A.C.F.:
$R(t_1, t_2) = E[(P + Qt_1)(P + Qt_2)] = E[P^2] + E[PQ](t_1 + t_2) + E[Q^2]\, t_1 t_2.$

Since P and Q are independent, E(PQ) = E(P)E(Q) = pq. Also
$E(P^2) = Var(P) + [E(P)]^2 = \sigma_1^2 + p^2, \qquad E(Q^2) = Var(Q) + [E(Q)]^2 = \sigma_2^2 + q^2.$

Therefore $R(t_1, t_2) = \sigma_1^2 + t_1 t_2\,\sigma_2^2 + p^2 + t_1 t_2\, q^2 + pq(t_1 + t_2).$

Further,
$E[X^2(t)] = E[P^2 + Q^2 t^2 + 2PQt] = \sigma_1^2 + p^2 + t^2(\sigma_2^2 + q^2) + 2tpq,$
$Var[X(t)] = E[X^2(t)] - (E[X(t)])^2 = \sigma_1^2 + p^2 + t^2(\sigma_2^2 + q^2) + 2tpq - (p + qt)^2 = \sigma_1^2 + t^2\sigma_2^2.$

Since E[X(t)] and Var[X(t)] are functions of time t, the random process {X(t)} is not stationary in any sense; it is evolutionary.

Auto-covariance:
$C_{XX}(t_1, t_2) = R_{XX}(t_1, t_2) - E[X(t_1)]\,E[X(t_2)] = \sigma_1^2 + t_1 t_2\,\sigma_2^2 + p^2 + t_1 t_2\, q^2 + pq(t_1 + t_2) - (p + qt_1)(p + qt_2).$

Therefore $C_{XX}(t_1, t_2) = \sigma_1^2 + t_1 t_2\,\sigma_2^2$.
Example 1.4.11 If $X(t) = Y\cos t + Z\sin t$ for all t, where Y and Z are independent binary random variables, each of which assumes the values -1 and 2 with probabilities $\frac{2}{3}$ and $\frac{1}{3}$ respectively, prove that {X(t)} is wide-sense stationary.

Solution:
$E(Y) = (-1)\tfrac{2}{3} + 2\cdot\tfrac{1}{3} = 0, \qquad E(Y^2) = (-1)^2\tfrac{2}{3} + 4\cdot\tfrac{1}{3} = 2, \qquad Var(Y) = 2.$
Similarly, E(Z) = 0 and Var(Z) = 2. Since Y and Z are independent, E(YZ) = 0.

$E[X(t)] = E[Y]\cos t + E[Z]\sin t = 0.$

$R_{XX}(t, t + \tau) = E[(Y\cos t + Z\sin t)(Y\cos(t + \tau) + Z\sin(t + \tau))]$
$= E[Y^2]\cos t\cos(t + \tau) + E[Z^2]\sin t\sin(t + \tau) + E[YZ]\big[\cos t\sin(t + \tau) + \sin t\cos(t + \tau)\big]$
$= 2\big[\cos(t + \tau)\cos t + \sin(t + \tau)\sin t\big] + 0 = 2\cos\tau.$

Therefore $R_{XX}(\tau) = 2\cos\tau$. Since E[X(t)] is a constant and the A.C.F. is a function of $\tau$ only, {X(t)} is a WSS process.
Example 1.4.12 If $X(t) = R\cos(\omega t + \theta)$, where R and $\theta$ are independent random variables with E(R) = 2 and Var(R) = 6, and $\theta$ is uniformly distributed in $(-\pi, \pi)$, prove that {X(t)} is a WSS process.

Solution: Since $\theta$ is uniformly distributed in $(-\pi, \pi)$, the p.d.f. of $\theta$ is
$f(\theta) = \dfrac{1}{2\pi}, \ -\pi < \theta < \pi; \qquad 0 \text{ otherwise.}$

Now $E[X(t)] = E(R)\, E[\cos(\omega t + \theta)] = 2\cdot\dfrac{1}{2\pi}\int_{-\pi}^{\pi}\cos(\omega t + \theta)\, d\theta = \dfrac{1}{\pi}\big[\sin(\omega t + \theta)\big]_{-\pi}^{\pi} = \dfrac{1}{\pi}\big[-\sin\omega t + \sin\omega t\big] = 0.$

$R_{XX}(t, t + \tau) = E[R^2\cos(\omega t + \theta)\cos(\omega t + \omega\tau + \theta)] = E[R^2]\, E[\cos(\omega t + \theta)\cos(\omega t + \omega\tau + \theta)].$

Since Var(R) = 6, we get $E(R^2) = Var(R) + [E(R)]^2 = 6 + 4 = 10$. Therefore
$R_{XX}(t, t + \tau) = \dfrac{10}{2} E[2\cos(\omega t + \theta)\cos(\omega t + \omega\tau + \theta)]\cdot\dfrac{1}{2}\cdot 2 = 5\, E[\cos(2\omega t + \omega\tau + 2\theta) + \cos\omega\tau]$
$= 5\cos\omega\tau + \dfrac{5}{2\pi}\int_{-\pi}^{\pi}\cos(2\omega t + \omega\tau + 2\theta)\, d\theta = 5\cos\omega\tau + \dfrac{5}{4\pi}\big[\sin(2\omega t + \omega\tau + 2\theta)\big]_{-\pi}^{\pi} = 5\cos\omega\tau + 0.$

$R_{XX}(t, t + \tau) = 5\cos\omega\tau$, a function of $\tau$ only.

Therefore E[X(t)] is a constant and the A.C.F. is a function of $\tau$ only, so {X(t)} is a WSS process.

Example 1.4.13 Given a random variable $\omega$ with density $f(\omega)$ and another random variable $\theta$ uniformly distributed in $(-\pi, \pi)$ and independent of $\omega$, and $X(t) = a\cos(\omega t + \theta)$, prove that {X(t)} is a WSS process.

Solution:
$E[X(t)] = a\, E[\cos(\omega t + \theta)] = a\, E\big[E\{\cos(\omega t + \theta)/\omega\}\big] = a\, E\big[\cos\omega t\, E(\cos\theta) - \sin\omega t\, E(\sin\theta)\big]$
$= a\, E\left[\cos\omega t\cdot\dfrac{1}{2\pi}\int_{-\pi}^{\pi}\cos\theta\, d\theta - \sin\omega t\cdot\dfrac{1}{2\pi}\int_{-\pi}^{\pi}\sin\theta\, d\theta\right] = a\, E[\cos\omega t\,(0) - \sin\omega t\,(0)] = 0.$

$E[X(t_1)X(t_2)] = a^2\, E[\cos(\omega t_1 + \theta)\cos(\omega t_2 + \theta)]$
$= a^2\, E\big[E\{\cos\omega t_1\cos\omega t_2\cos^2\theta + \sin\omega t_1\sin\omega t_2\sin^2\theta - \sin\omega(t_1 + t_2)\sin\theta\cos\theta\,/\,\omega\}\big]$
$= a^2\, E\left[\cos\omega t_1\cos\omega t_2\cdot\dfrac{1}{2\pi}\int_{-\pi}^{\pi}\cos^2\theta\, d\theta + \sin\omega t_1\sin\omega t_2\cdot\dfrac{1}{2\pi}\int_{-\pi}^{\pi}\sin^2\theta\, d\theta - \dfrac{\sin\omega(t_1 + t_2)}{4\pi}\int_{-\pi}^{\pi}\sin 2\theta\, d\theta\right]$
$= \dfrac{a^2}{2}\, E[\cos\omega t_1\cos\omega t_2 + \sin\omega t_1\sin\omega t_2] = \dfrac{a^2}{2}\, E[\cos\omega(t_1 - t_2)].$

Therefore $R_{XX}(t_1, t_2)$ is a function of $t_1 - t_2$ whatever the density $f(\omega)$. Hence {X(t)} is a WSS process.

Example 1.4.14 Show that the random process $X(t) = A\sin t + B\cos t$, where A and B are independent random variables with zero means and equal standard deviations, is stationary of the second order.

Solution:
The A.C.F. of a second-order stationary process is a function of the time difference only and not of absolute time. Consider $R_{XX}(t, t + \tau) = E[X(t)X(t + \tau)]$:
$R_{XX}(t, t + \tau) = E[(A\sin t + B\cos t)(A\sin(t + \tau) + B\cos(t + \tau))]$
$= E[A^2]\sin(t + \tau)\sin t + E[B^2]\cos(t + \tau)\cos t + E[AB]\{\sin t\cos(t + \tau) + \sin(t + \tau)\cos t\}.$

Given E(A) = E(B) = 0, so E(AB) = E(A)E(B) = 0, and $Var(A) = Var(B) = \sigma^2$, so $E(A^2) = E(B^2) = \sigma^2$. Therefore
$R_{XX}(t, t + \tau) = \sigma^2[\sin(t + \tau)\sin t + \cos(t + \tau)\cos t] = \sigma^2\cos\tau.$

Therefore the A.C.F. is a function of the time difference $\tau$ only. Hence {X(t)} is stationary of order two.

Example 1.4.15 Consider a random process $Y(t) = X(t)\cos(\omega t + \theta)$, where {X(t)} is wide-sense stationary, $\theta$ is uniformly distributed in $(-\pi, \pi)$ and is independent of X(t), and $\omega$ is a constant. Prove that {Y(t)} is wide-sense stationary.

Solution:
$E[Y(t)] = E[X(t)]\, E[\cos(\omega t + \theta)] = E[X(t)]\cdot\dfrac{1}{2\pi}\int_{-\pi}^{\pi}\cos(\omega t + \theta)\, d\theta = E[X(t)]\,(0) = 0.$

$R_{YY}(t, t + \tau) = E[X(t)X(t + \tau)]\, E[\cos(\omega t + \omega\tau + \theta)\cos(\omega t + \theta)]$
$= \dfrac{R_{XX}(\tau)}{2}\, E[\cos\omega\tau + \cos(2\omega t + \omega\tau + 2\theta)]$, since {X(t)} is WSS,
$= \dfrac{R_{XX}(\tau)}{2}\cos\omega\tau + \dfrac{R_{XX}(\tau)}{2}\cdot\dfrac{1}{2\pi}\int_{-\pi}^{\pi}\cos(2\omega t + \omega\tau + 2\theta)\, d\theta = \dfrac{R_{XX}(\tau)}{2}\cos\omega\tau + 0.$

Therefore the A.C.F. of Y(t) is a function of $\tau$ only. Since E[Y(t)] is a constant and the A.C.F. of Y(t) is a function of $\tau$ only, {Y(t)} is a WSS process.

Example 1.4.16 Let $X(t) = A\cos\lambda t + B\sin\lambda t$, where A and B are independent normally distributed random variables $N(0, \sigma^2)$. Obtain the covariance function of the process $\{X(t) : -\infty < t < \infty\}$. Is {X(t)} covariance stationary?

Solution:
$E[X(t)] = E[A]\cos\lambda t + E[B]\sin\lambda t.$
Since A and B are independent normal random variables $N(0, \sigma^2)$, E(A) = E(B) = 0, $Var(A) = Var(B) = \sigma^2$ and $E(A^2) = E(B^2) = \sigma^2$. Thus E[X(t)] = 0.

The A.C.F. of {X(t)} is given by
$E[X(t)X(s)] = E[A^2]\cos\lambda t\cos\lambda s + E[B^2]\sin\lambda t\sin\lambda s + E[AB]\{\cos\lambda t\sin\lambda s + \sin\lambda t\cos\lambda s\}.$

Since $E(A^2) = E(B^2) = \sigma^2$ and E(AB) = E(A)E(B) = 0,
$R(t, s) = \sigma^2\{\cos\lambda t\cos\lambda s + \sin\lambda t\sin\lambda s\} = \sigma^2\cos\lambda(t - s).$

Covariance: $C(t, s) = R(t, s) - E[X(t)]E[X(s)] = R(t, s) = \sigma^2\cos\lambda(t - s).$

Since the covariance function is a function of $\tau = t - s$ only, and E[X(t)] is constant, {X(t)} is covariance stationary.
Example 1.4.17 Consider a random process $X(t) = B\cos(50t + \theta)$, where B and $\theta$ are independent random variables, B has mean 0 and variance 1, and $\theta$ is uniformly distributed in the interval $(-\pi, \pi)$. Show that {X(t)} is WSS.

Solution:
E(B) = 0 and Var(B) = 1, so $E(B^2) = 1$.

$E[X(t)] = E[B]\, E[\cos(50t + \theta)] = 0.$

$R_{XX}(t_1, t_2) = E[B\cos(50t_1 + \theta)\, B\cos(50t_2 + \theta)] = \dfrac{E[B^2]}{2}\, E[2\cos(50t_1 + \theta)\cos(50t_2 + \theta)]$
$= \dfrac{1}{2} E[\cos 50(t_1 - t_2)] + \dfrac{1}{2} E[\cos(50t_1 + 50t_2 + 2\theta)]$
$= \dfrac{\cos 50(t_1 - t_2)}{2} + \dfrac{1}{4\pi}\int_{-\pi}^{\pi}\cos(50t_1 + 50t_2 + 2\theta)\, d\theta = \dfrac{\cos 50(t_1 - t_2)}{2} + \dfrac{1}{8\pi}\big[\sin(50t_1 + 50t_2 + 2\theta)\big]_{-\pi}^{\pi} = \dfrac{\cos 50(t_1 - t_2)}{2}.$

Therefore $R_{XX}(t_1, t_2)$ is a function of the time difference $t_1 - t_2$. Since E[X(t)] is a constant and the A.C.F. is a function of $\tau = t_1 - t_2$ only, {X(t)} is WSS.

Example 1.4.18 Consider a random process defined on a finite sample space with three sample points, given by the three sample functions $X(t, s_1) = 3$, $X(t, s_2) = 3\cos t$ and $X(t, s_3) = 4\sin t$, all of which are equally likely, i.e. $P(s_i) = \frac{1}{3}$ for all i. Compute the mean and A.C.F. Is the process strict-sense stationary? Is it wide-sense stationary?

Solution:
$\text{Mean} = \sum_{i=1}^{3} X(t, s_i)\, P(s_i) = \tfrac{1}{3}(3) + \tfrac{1}{3}(3\cos t) + \tfrac{1}{3}(4\sin t) = 1 + \cos t + \tfrac{4}{3}\sin t.$

$\text{A.C.F.} = E[X(t_1)X(t_2)] = \sum_{i=1}^{3} X(t_1, s_i)\, X(t_2, s_i)\, P(s_i)$
$= \tfrac{1}{3}\big[9 + 9\cos t_1\cos t_2 + 16\sin t_1\sin t_2\big] = 3 + 3\cos(t_1 - t_2) + \tfrac{7}{3}\sin t_1\sin t_2.$

$R_{XX}(t_1, t_2)$ is not a function of $t_1 - t_2$ only. Since E[X(t)] is not a constant, X(t) is not WSS. Also, since $R_{XX}(t_1, t_2)$ is not a function of $t_2 - t_1$ only, X(t) is not SSS.

1.5

Examples of Ergodic Process

Example 1.5.1 If the WSS process {X(t)} is given by $X(t) = 10\cos(100t + \theta)$, where $\theta$ is uniformly distributed over $(-\pi, \pi)$, prove that {X(t)} is correlation ergodic.

Solution:
Ensemble A.C.F.:
$R_{XX}(t, t + \tau) = E[10\cos(100t + 100\tau + \theta)\cdot 10\cos(100t + \theta)] = 50\, E[\cos 100\tau + \cos(200t + 100\tau + 2\theta)]$
$= 50\cos 100\tau + \dfrac{50}{2\pi}\int_{-\pi}^{\pi}\cos(200t + 100\tau + 2\theta)\, d\theta = 50\cos 100\tau + \dfrac{50}{4\pi}\big[\sin(200t + 100\tau + 2\theta)\big]_{-\pi}^{\pi} = 50\cos 100\tau.$

Consider the time-averaged A.C.F.:
$\lim_{T\to\infty} Z_T = \lim_{T\to\infty} \dfrac{1}{2T}\int_{-T}^{T} X(t)X(t + \tau)\, dt = \lim_{T\to\infty} \dfrac{1}{2T}\int_{-T}^{T} 100\cos(100t + \theta)\cos(100t + 100\tau + \theta)\, dt$
$= \lim_{T\to\infty} \dfrac{25}{T}\int_{-T}^{T}\cos 100\tau\, dt + \lim_{T\to\infty} \dfrac{25}{T}\int_{-T}^{T}\cos(200t + 100\tau + 2\theta)\, dt$
$= 50\cos 100\tau + \lim_{T\to\infty} \dfrac{25}{T}\left[\dfrac{\sin(200t + 100\tau + 2\theta)}{200}\right]_{-T}^{T} = 50\cos 100\tau.$

Therefore $\lim_{T\to\infty} Z_T = R_{XX}(\tau) = 50\cos 100\tau$.

Since the ensemble A.C.F. equals the time-averaged A.C.F., {X(t)} is correlation ergodic.

Example 1.5.2 Prove that the random process {X(t)} defined by $X(t) = A\cos(\omega t + \theta)$, where A and $\omega$ are constants and $\theta$ is uniformly distributed over $(0, 2\pi)$, is ergodic in both the mean and the auto-correlation function.

Solution:
Ensemble mean and A.C.F.:
$E[X(t)] = \dfrac{A}{2\pi}\int_0^{2\pi}\cos(\omega t + \theta)\, d\theta = \dfrac{A}{2\pi}\big[\sin(\omega t + \theta)\big]_0^{2\pi} = 0.$

$R_{XX}(t, t + \tau) = E[X(t)X(t + \tau)] = \dfrac{A^2}{2} E[\cos\omega\tau + \cos(2\omega t + \omega\tau + 2\theta)]$
$= \dfrac{A^2}{2}\cos\omega\tau + \dfrac{A^2}{4\pi}\int_0^{2\pi}\cos(2\omega t + \omega\tau + 2\theta)\, d\theta = \dfrac{A^2}{2}\cos\omega\tau.$

Therefore $R_{XX}(\tau) = \dfrac{A^2}{2}\cos\omega\tau$.

Time-averaged mean and A.C.F.:
$\bar{X}_T = \lim_{T\to\infty}\dfrac{1}{2T}\int_{-T}^{T} X(t)\, dt = \lim_{T\to\infty}\dfrac{A}{2T}\left[\dfrac{\sin(\omega t + \theta)}{\omega}\right]_{-T}^{T} = 0.$

$\overline{X(t + \tau)X(t)} = \lim_{T\to\infty}\dfrac{1}{2T}\int_{-T}^{T} A^2\cos(\omega t + \omega\tau + \theta)\cos(\omega t + \theta)\, dt$
$= \lim_{T\to\infty}\dfrac{A^2}{4T}\int_{-T}^{T}\{\cos\omega\tau + \cos(2\omega t + \omega\tau + 2\theta)\}\, dt$
$= \lim_{T\to\infty}\dfrac{A^2}{4T}(\cos\omega\tau)(2T) + \lim_{T\to\infty}\dfrac{A^2}{4T}\left[\dfrac{\sin(2\omega t + \omega\tau + 2\theta)}{2\omega}\right]_{-T}^{T} = \dfrac{A^2}{2}\cos\omega\tau.$

Since the ensemble mean equals the time-averaged mean, {X(t)} is mean ergodic. Also, since the ensemble A.C.F. equals the time-averaged A.C.F., {X(t)} is correlation ergodic.
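Ergodicity can also be seen numerically by integrating a single sample function over a long interval. The sketch below (illustrative only, with one arbitrarily fixed phase) compares the time averages with the ensemble values 0 and $(A^2/2)\cos\omega\tau$.

```python
# Time averages of one sample function of X(t) = A*cos(w*t + theta).
import numpy as np

A, w, tau = 1.5, 2.0, 0.6
theta = 0.83                              # one fixed realisation of the random phase
T = 500.0
t = np.linspace(-T, T, 2_000_001)
x = A * np.cos(w * t + theta)
x_shift = A * np.cos(w * (t + tau) + theta)

time_mean = np.trapz(x, t) / (2 * T)      # approximates the limit as T -> infinity
time_acf = np.trapz(x * x_shift, t) / (2 * T)

print(time_mean)                          # ~ 0, the ensemble mean
print(time_acf)                           # ~ (A**2/2)*cos(w*tau), the ensemble ACF
print(0.5 * A**2 * np.cos(w * tau))
```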

Example 1.5.3 {X(t)} is the random telegraph signal process with E[X(t)] = 0 and $R(\tau) = e^{-2\lambda|\tau|}$. Find the mean and variance of the time average of {X(t)} over (-T, T). Is it mean ergodic?

Solution:
The time average of {X(t)} over (-T, T) is
$\bar{X}_T = \dfrac{1}{2T}\int_{-T}^{T} X(t)\, dt.$

Therefore $E[\bar{X}_T] = \dfrac{1}{2T}\int_{-T}^{T} E[X(t)]\, dt = 0$, since E[X(t)] = 0.

To find $Var(\bar{X}_T)$: since E[X(t)] = 0, $C(\tau) = R(\tau) = e^{-2\lambda|\tau|}$, and

$Var(\bar{X}_T) = \dfrac{1}{T}\int_0^{2T}\left(1 - \dfrac{\tau}{2T}\right) C(\tau)\, d\tau = \dfrac{1}{T}\int_0^{2T} e^{-2\lambda\tau}\, d\tau - \dfrac{1}{2T^2}\int_0^{2T} \tau\, e^{-2\lambda\tau}\, d\tau$
$= \dfrac{1 - e^{-4\lambda T}}{2\lambda T} - \dfrac{1}{2T^2}\left[\dfrac{1 - e^{-4\lambda T}}{4\lambda^2} - \dfrac{T e^{-4\lambda T}}{\lambda}\right]$

Therefore $Var(\bar{X}_T) = \dfrac{1}{2\lambda T} + \dfrac{1}{8\lambda^2 T^2}\big(e^{-4\lambda T} - 1\big)$.

Since $\lim_{T\to\infty} Var(\bar{X}_T) = 0$, the process is mean ergodic.

Example 1.5.4 Let {X(t) : t ≥ 0} be a random process where X(t) = +1 if the total number of points (events) in the interval (0, t) is even, and X(t) = -1 if it is odd. Find the A.C.F. of X(t). Also, if $P(A = 1) = P(A = -1) = \frac{1}{2}$ and A is independent of X(t), find the A.C.F. of Y(t) = A X(t).

Solution:
The probability law of the underlying point process is $P[N(t) = k] = \dfrac{e^{-\lambda t}(\lambda t)^k}{k!}$, k = 0, 1, 2, \ldots

Then
$P[X(t) = 1] = P[N(t) \text{ is even}] = e^{-\lambda t}\left[1 + \dfrac{(\lambda t)^2}{2!} + \dfrac{(\lambda t)^4}{4!} + \ldots\right] = e^{-\lambda t}\cosh\lambda t,$
$P[X(t) = -1] = P[N(t) \text{ is odd}] = e^{-\lambda t}\left[\dfrac{\lambda t}{1!} + \dfrac{(\lambda t)^3}{3!} + \ldots\right] = e^{-\lambda t}\sinh\lambda t.$

For $t_1 > t_2$ and $\tau = t_1 - t_2$,
$P[X(t_1) = 1, X(t_2) = 1] = P[X(t_1) = 1/X(t_2) = 1]\, P[X(t_2) = 1] = (e^{-\lambda\tau}\cosh\lambda\tau)(e^{-\lambda t_2}\cosh\lambda t_2)$
$P[X(t_1) = -1, X(t_2) = 1] = (e^{-\lambda\tau}\sinh\lambda\tau)(e^{-\lambda t_2}\cosh\lambda t_2)$
$P[X(t_1) = 1, X(t_2) = -1] = (e^{-\lambda\tau}\sinh\lambda\tau)(e^{-\lambda t_2}\sinh\lambda t_2)$
$P[X(t_1) = -1, X(t_2) = -1] = (e^{-\lambda\tau}\cosh\lambda\tau)(e^{-\lambda t_2}\sinh\lambda t_2).$

Then $P[X(t_1)X(t_2) = 1] = e^{-\lambda\tau}\cosh\lambda\tau$ and $P[X(t_1)X(t_2) = -1] = e^{-\lambda\tau}\sinh\lambda\tau$.

Therefore $R(t_1, t_2) = 1\cdot e^{-\lambda\tau}\cosh\lambda\tau - 1\cdot e^{-\lambda\tau}\sinh\lambda\tau = e^{-2\lambda\tau} = e^{-2\lambda(t_1 - t_2)}.$

A.C.F. of Y(t) = A X(t):
$E(A) = (1)P[A = 1] + (-1)P[A = -1] = \tfrac{1}{2} - \tfrac{1}{2} = 0,$
$E(A^2) = (1)^2 P[A = 1] + (-1)^2 P[A = -1] = \tfrac{1}{2} + \tfrac{1}{2} = 1.$

$R_{YY}(t_1, t_2) = E[A^2 X(t_1)X(t_2)] = E(A^2)\, R_{XX}(t_1, t_2) = e^{-2\lambda(t_1 - t_2)}.$

Chapter 2

Markov Process and Markov Chain

2.1

Basic Definitions

Definition 2.1.1 (Markov Process) A random process {X(t)} is called a Markov process if
$P[X(t_n) = a_n / X(t_{n-1}) = a_{n-1}, X(t_{n-2}) = a_{n-2}, \ldots, X(t_2) = a_2, X(t_1) = a_1] = P[X(t_n) = a_n / X(t_{n-1}) = a_{n-1}]$
for all $t_1 < t_2 < \ldots < t_n$. In other words, if the future behavior of a process depends only on the present value and not on the past, the process is called a Markov process.

Example 2.1.2 The probability of rain today depends only on the weather conditions that existed during the last two days and not on earlier weather conditions.

Definition 2.1.3 (Markov Chain) If the above condition is satisfied for all n, then the process $\{X_n;\ n = 0, 1, 2, \ldots\}$ is called a Markov chain and the constants $(a_1, a_2, \ldots, a_n)$ are called the states of the Markov chain. In other words, a discrete-parameter Markov process is called a Markov chain.

Definition 2.1.4 (One-Step Transition Probability) The conditional probability $P[X_n = a_j / X_{n-1} = a_i]$ is called the one-step transition probability from state $a_i$ to state $a_j$ at the nth step and is denoted by $P_{ij}(n-1, n)$.

Definition 2.1.5 (Homogeneous Markov Chain) If the one-step transition probability does not depend on the step, i.e. $P_{ij}(n-1, n) = P_{ij}(m-1, m)$, the Markov chain is called a homogeneous Markov chain.

Definition 2.1.6 (Transition Probability Matrix) When the Markov chain is homogeneous, the one-step transition probability is denoted by $p_{ij}$. The matrix $P = (p_{ij})$ is called the transition probability matrix (tpm); it satisfies the conditions (i) $p_{ij} \ge 0$ and (ii) $\sum_j p_{ij} = 1$ for all i, i.e. the sum of the elements of any row of the tpm is 1.

Definition 2.1.7 (n-Step Transition Probability) The conditional probability that the process is in state $a_j$ at step n, given that it was in state $a_i$ at step 0, i.e. $P[X_n = a_j / X_0 = a_i]$, is called the n-step transition probability and is denoted by $P_{ij}^{(n)}$. That is, $P_{ij}^{(n)} = P[X_n = a_j / X_0 = a_i]$.

Chapman-Kolmogorov theorem:
If P is the tpm of a homogeneous Markov chain, then the n-step tpm $P^{(n)}$ is equal to $P^n$. Thus $[P_{ij}^{(n)}] = [P_{ij}]^n$.
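In computations this theorem is used directly: the n-step transition probabilities are obtained as the nth matrix power of the one-step tpm. The following sketch (with an arbitrary example matrix, not taken from the text) illustrates this.

```python
# n-step transition probabilities via matrix powers (Chapman-Kolmogorov).
import numpy as np

P = np.array([[0.5, 0.4, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])

P2 = P @ P                          # two-step transition probabilities
P10 = np.linalg.matrix_power(P, 10) # ten-step transition probabilities

print(P2)
print(P10)                          # rows approach the steady-state distribution
print(P10.sum(axis=1))              # each row still sums to 1
```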

Definition 2.1.8 (Regular Markov Chain) A stochastic matrix P is said to be a regular matrix if all the entries of $P^m$ (for some positive integer m) are positive. A homogeneous Markov chain is said to be regular if its tpm is regular.

Definition 2.1.9 (Steady state distribution) If a homogeneous Markov chain is regular, then every sequence of state probability distributions approaches a unique fixed distribution of the Markov chain. That is, $\lim_{n\to\infty} P^{(n)} = \pi$, where $P^{(n)} = (p_1^{(n)}, p_2^{(n)}, \ldots, p_k^{(n)})$ and $\pi = (\pi_1, \pi_2, \pi_3, \ldots, \pi_k)$.

If P is the tpm of the regular Markov chain and $\pi = (\pi_1, \pi_2, \pi_3, \ldots, \pi_k)$ is the steady state distribution, then $\pi P = \pi$ and $\pi_1 + \pi_2 + \pi_3 + \ldots + \pi_k = 1$.

2.2

Classification of states of Markov chain


Definition 2.2.1 If $p_{ij}^{(n)} > 0$ for some n and for all i and j, then every state can be reached from every other state. When this condition is satisfied, the Markov chain is said to be irreducible. The tpm of an irreducible chain is an irreducible matrix. Otherwise the chain is said to be non-irreducible or reducible.

Definition 2.2.2 A state i of a Markov chain is called a return state if $p_{ii}^{(n)} > 0$ for some n > 1.

Definition 2.2.3 The period $d_i$ of a return state i is defined as the greatest common divisor of all m such that $p_{ii}^{(m)} > 0$, i.e. $d_i = \mathrm{GCD}\{m : p_{ii}^{(m)} > 0\}$.

State i is said to be periodic with period $d_i$ if $d_i > 1$ and aperiodic if $d_i = 1$. Obviously state i is aperiodic if $p_{ii} \neq 0$.

Definition 2.2.4 (Recurrence time probability) The probability that the chain returns to state i, starting from state i, for the first time at the nth step is called the recurrence time probability or the first return time probability. It is denoted by $f_{ii}^{(n)}$.

$\{n, f_{ii}^{(n)}\}$, n = 1, 2, 3, \ldots, is the distribution of the recurrence time of state i.

If $F_{ii} = \sum_{n=1}^{\infty} f_{ii}^{(n)} = 1$, then the return to state i is certain, and $\mu_{ii} = \sum_{n=1}^{\infty} n f_{ii}^{(n)}$ is called the mean recurrence time of state i.

Definition 2.2.5 A state i is said to be persistent or recurrent if the return to state i is certain, i.e. if $F_{ii} = 1$.

The state i is said to be transient if the return to state i is uncertain, i.e. $F_{ii} < 1$.

The state i is said to be non-null persistent if its mean recurrence time $\mu_{ii}$ is finite, and null persistent if $\mu_{ii} = \infty$.

A non-null persistent and aperiodic state is called ergodic.

Remark 2.2.6

1. If a Markov chain is irreducible, all its states are of the same type: they are all transient, all null persistent, or all non-null persistent. All its states are either aperiodic or periodic with the same period.

2. If a Markov chain is finite and irreducible, all its states are non-null persistent.

Definition 2.2.7 (Absorbing state) A state i is called an absorbing state if $p_{ii} = 1$ and $p_{ij} = 0$ for $i \neq j$.

Calculation of joint probability

$P[X_0 = a, X_1 = b, \ldots, X_{n-2} = i, X_{n-1} = j, X_n = k]$
$= P[X_n = k / X_{n-1} = j]\, P[X_{n-1} = j / X_{n-2} = i] \ldots P[X_1 = b / X_0 = a]\, P[X_0 = a]$
$= P_{jk}\, P_{ij} \ldots P_{ab}\, P(X_0 = a).$

2.3

Examples

Example 2.3.1 Consider a Markov chain with transition probability matrix
$P = \begin{pmatrix} 0.5 & 0.4 & 0.1 \\ 0.3 & 0.4 & 0.3 \\ 0.2 & 0.3 & 0.5 \end{pmatrix}.$
Find the steady state probabilities of the system.

Solution: Let the invariant probabilities of P be $\pi = (\pi_1, \pi_2, \pi_3)$. By the property of $\pi$, $\pi P = \pi$:

$0.5\pi_1 + 0.3\pi_2 + 0.2\pi_3 = \pi_1 \Rightarrow 0.5\pi_1 - 0.3\pi_2 - 0.2\pi_3 = 0 \qquad (1)$
$0.4\pi_1 + 0.4\pi_2 + 0.3\pi_3 = \pi_2 \Rightarrow 0.4\pi_1 - 0.6\pi_2 + 0.3\pi_3 = 0 \qquad (2)$
$0.1\pi_1 + 0.3\pi_2 + 0.5\pi_3 = \pi_3 \Rightarrow 0.1\pi_1 + 0.3\pi_2 - 0.5\pi_3 = 0 \qquad (3)$

Adding (2) and (3) gives $0.5\pi_1 - 0.3\pi_2 - 0.2\pi_3 = 0$, which is (1), so only two of these equations are independent. Since $\pi$ is a probability distribution,

$\pi_1 + \pi_2 + \pi_3 = 1 \qquad (4)$

(4) $\times$ 0.3: $0.3\pi_1 + 0.3\pi_2 + 0.3\pi_3 = 0.3 \qquad (5)$
(1) + (5): $0.8\pi_1 + 0.1\pi_3 = 0.3 \qquad (6)$
(1) + (3): $0.6\pi_1 - 0.7\pi_3 = 0 \qquad (7)$
(6) $\times$ 7: $5.6\pi_1 + 0.7\pi_3 = 2.1 \qquad (8)$
(7) + (8): $6.2\pi_1 = 2.1 \Rightarrow \pi_1 = \dfrac{2.1}{6.2} \approx 0.339.$

Putting $\pi_1$ in (7): $0.7\pi_3 = 0.6\pi_1 \Rightarrow \pi_3 = \dfrac{6}{7}\pi_1 = \dfrac{18}{62} \approx 0.290.$

From (4): $\pi_2 = 1 - \pi_1 - \pi_3 = \dfrac{23}{62} \approx 0.371.$

Hence the invariant probabilities of P are approximately (0.339, 0.371, 0.290).
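As a cross-check, the same invariant distribution can be obtained by solving the linear system $\pi P = \pi$, $\sum_i \pi_i = 1$ numerically. The sketch below is illustrative, not part of the original text.

```python
# Numerical cross-check of Example 2.3.1: solve pi P = pi with sum(pi) = 1.
import numpy as np

P = np.array([[0.5, 0.4, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])

# Stack (P^T - I) pi = 0 with the normalisation sum(pi) = 1 and solve by least squares.
A = np.vstack([P.T - np.eye(3), np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print(pi)   # approximately [0.339, 0.371, 0.290]
```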

Example 2.3.2 At an intersection, a working traffic light will be out of order the next day with probability 0.07, and an out-of-order traffic light will be working the next day with probability 0.88. Let $X_n = 1$ if on day n the traffic light works and $X_n = 0$ if on day n it does not work. Is $\{X_n;\ n = 0, 1, 2, \ldots\}$ a Markov chain? If so, write the transition probability matrix.

Solution: Yes, $\{X_n;\ n = 0, 1, 2, \ldots\}$ is a Markov chain with state space {0, 1}. The transition probability matrix (rows and columns in the order 0, 1) is
$P = \begin{pmatrix} 0.12 & 0.88 \\ 0.07 & 0.93 \end{pmatrix}.$
Example 2.3.3 Let {X_n} be a Markov chain with state space {1, 2, 3}, initial probability vector $P^{(0)} = (0.7, 0.2, 0.1)$ and one-step transition probability matrix
$P = \begin{pmatrix} 0.1 & 0.5 & 0.4 \\ 0.6 & 0.2 & 0.2 \\ 0.3 & 0.4 & 0.3 \end{pmatrix}.$
Compute $P(X_2 = 3)$ and $P(X_3 = 2, X_2 = 3, X_1 = 3, X_0 = 2)$.

Solution:
$P^{(2)} = P^2 = \begin{pmatrix} 0.1 & 0.5 & 0.4 \\ 0.6 & 0.2 & 0.2 \\ 0.3 & 0.4 & 0.3 \end{pmatrix}^2 = \begin{pmatrix} 0.43 & 0.31 & 0.26 \\ 0.24 & 0.42 & 0.34 \\ 0.36 & 0.35 & 0.29 \end{pmatrix}.$

(i) $P[X_2 = 3] = \sum_{i=1}^{3} P[X_2 = 3 / X_0 = i]\, P[X_0 = i].$

Given $P^{(0)} = (0.7, 0.2, 0.1)$, we have $P[X_0 = 1] = 0.7$, $P[X_0 = 2] = 0.2$ and $P[X_0 = 3] = 0.1$. Therefore
$P[X_2 = 3] = P_{13}^{(2)} P(X_0 = 1) + P_{23}^{(2)} P(X_0 = 2) + P_{33}^{(2)} P(X_0 = 3)$
$= 0.26 \times 0.7 + 0.34 \times 0.2 + 0.29 \times 0.1 = 0.182 + 0.068 + 0.029 = 0.279.$

(ii) $P\{X_1 = 3 / X_0 = 2\} = P_{23} = 0.2 \qquad (1)$
$P\{X_1 = 3, X_0 = 2\} = P\{X_1 = 3 / X_0 = 2\}\, P\{X_0 = 2\} = 0.2 \times 0.2 = 0.04 \qquad (2)$
$P\{X_2 = 3, X_1 = 3, X_0 = 2\} = P\{X_2 = 3 / X_1 = 3\}\, P\{X_1 = 3, X_0 = 2\}$ (by the Markov property)
$= 0.3 \times 0.04 = 0.012 \qquad (3)$
$P\{X_3 = 2, X_2 = 3, X_1 = 3, X_0 = 2\} = P\{X_3 = 2 / X_2 = 3\}\, P\{X_2 = 3, X_1 = 3, X_0 = 2\} = 0.4 \times 0.012 = 0.0048.$
Example 2.3.4 A fair die is tossed repeatedly. If $X_n$ denotes the maximum of the numbers occurring in the first n tosses, find the transition probability matrix P of the Markov chain {X_n}. Find also $P^2$ and $P(X_2 = 6)$.

Solution: The state space is {1, 2, 3, 4, 5, 6}. Let $X_n$ = the maximum of the numbers occurring in the first n tosses, say $X_n = 3$. Then

$X_{n+1} = 3$ if the (n+1)th toss results in 1, 2 or 3,
$X_{n+1} = 4$ if the (n+1)th toss results in 4,
$X_{n+1} = 5$ if the (n+1)th toss results in 5,
$X_{n+1} = 6$ if the (n+1)th toss results in 6.

Therefore $P\{X_{n+1} = 3 / X_n = 3\} = \tfrac{1}{6} + \tfrac{1}{6} + \tfrac{1}{6} = \tfrac{3}{6}$ and $P\{X_{n+1} = i / X_n = 3\} = \tfrac{1}{6}$ for i = 4, 5, 6.

The transition probability matrix of the chain and its square are

$P = \dfrac{1}{6}\begin{pmatrix} 1 & 1 & 1 & 1 & 1 & 1 \\ 0 & 2 & 1 & 1 & 1 & 1 \\ 0 & 0 & 3 & 1 & 1 & 1 \\ 0 & 0 & 0 & 4 & 1 & 1 \\ 0 & 0 & 0 & 0 & 5 & 1 \\ 0 & 0 & 0 & 0 & 0 & 6 \end{pmatrix}, \qquad P^2 = \dfrac{1}{36}\begin{pmatrix} 1 & 3 & 5 & 7 & 9 & 11 \\ 0 & 4 & 5 & 7 & 9 & 11 \\ 0 & 0 & 9 & 7 & 9 & 11 \\ 0 & 0 & 0 & 16 & 9 & 11 \\ 0 & 0 & 0 & 0 & 25 & 11 \\ 0 & 0 & 0 & 0 & 0 & 36 \end{pmatrix}.$

The initial state probability distribution is $P^{(0)} = (\tfrac{1}{6}, \tfrac{1}{6}, \tfrac{1}{6}, \tfrac{1}{6}, \tfrac{1}{6}, \tfrac{1}{6})$, since all the values 1, 2, \ldots, 6 are equally likely.

$P\{X_2 = 6\} = \sum_{i=1}^{6} P\{X_2 = 6 / X_0 = i\}\, P\{X_0 = i\} = \dfrac{1}{6}\sum_{i=1}^{6} P_{i6}^{(2)} = \dfrac{1}{6}\cdot\dfrac{1}{36}(11 + 11 + 11 + 11 + 11 + 36) = \dfrac{91}{216}.$
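The result can be checked by direct simulation. Since $X_0$ is the first toss, $X_2$ is the maximum of three tosses of a fair die, and the estimate below (an illustration, not from the text) should be close to 91/216.

```python
# Simulation check of Example 2.3.4: P(X_2 = 6) where X_2 = max of three tosses.
import numpy as np

rng = np.random.default_rng(2)
tosses = rng.integers(1, 7, size=(1_000_000, 3))   # three tosses per experiment
x2 = tosses.max(axis=1)

print((x2 == 6).mean())   # simulated estimate
print(91 / 216)           # exact value from the example
```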

Example 2.3.5 Three girls $G_1$, $G_2$, $G_3$ are throwing a ball to each other. $G_1$ always throws the ball to $G_2$ and $G_2$ always throws the ball to $G_3$, but $G_3$ is just as likely to throw the ball to $G_2$ as to $G_1$. Prove that the process is Markovian. Find the transition matrix and classify the states.

Solution: The state of $X_n$ (the girl holding the ball after the nth throw) depends only on the state of $X_{n-1}$, not on the states of $X_{n-2}, X_{n-3}, \ldots$ or earlier states. Therefore {X_n} is a Markov chain. The transition probability matrix of the process {X_n} (states in the order $G_1$, $G_2$, $G_3$) is

$P = \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ \tfrac{1}{2} & \tfrac{1}{2} & 0 \end{pmatrix}.$

Now
$P^2 = \begin{pmatrix} 0 & 0 & 1 \\ \tfrac{1}{2} & \tfrac{1}{2} & 0 \\ 0 & \tfrac{1}{2} & \tfrac{1}{2} \end{pmatrix}, \qquad P^3 = \begin{pmatrix} \tfrac{1}{2} & \tfrac{1}{2} & 0 \\ 0 & \tfrac{1}{2} & \tfrac{1}{2} \\ \tfrac{1}{4} & \tfrac{1}{4} & \tfrac{1}{2} \end{pmatrix}, \qquad P^4 = \begin{pmatrix} 0 & \tfrac{1}{2} & \tfrac{1}{2} \\ \tfrac{1}{4} & \tfrac{1}{4} & \tfrac{1}{2} \\ \tfrac{1}{4} & \tfrac{1}{2} & \tfrac{1}{4} \end{pmatrix},$ and so on.

For sufficiently large n, $P_{ij}^{(n)} > 0$ for all i and j, so every state can be reached from every other state. Therefore the chain is irreducible.

Also $P_{ii}^{(3)}, P_{ii}^{(5)}, P_{ii}^{(6)}, \ldots$ are > 0 for i = 1, 2, 3, and the GCD of the corresponding return times is 1. Therefore each state, in particular state 1 (girl $G_1$), has period 1, i.e. the states are aperiodic. Since the chain is finite and irreducible, all its states are also non-null persistent.

Example 2.3.6 Find the nature of the states of the Markov chain (states 0, 1, 2) with the tpm
$P = \begin{pmatrix} 0 & 1 & 0 \\ \tfrac{1}{2} & 0 & \tfrac{1}{2} \\ 0 & 1 & 0 \end{pmatrix}.$

Solution:
$P^2 = \begin{pmatrix} \tfrac{1}{2} & 0 & \tfrac{1}{2} \\ 0 & 1 & 0 \\ \tfrac{1}{2} & 0 & \tfrac{1}{2} \end{pmatrix}, \qquad P^3 = P^2 P = \begin{pmatrix} 0 & 1 & 0 \\ \tfrac{1}{2} & 0 & \tfrac{1}{2} \\ 0 & 1 & 0 \end{pmatrix} = P, \qquad P^4 = P^3 P = P^2,$ and so on.

In general, $P^{2n} = P^2$ and $P^{2n+1} = P$.

Also $P_{00}^{(2)} > 0$, $P_{01}^{(1)} > 0$, $P_{02}^{(2)} > 0$; $P_{10}^{(1)} > 0$, $P_{11}^{(2)} > 0$, $P_{12}^{(1)} > 0$; $P_{20}^{(2)} > 0$, $P_{21}^{(1)} > 0$, $P_{22}^{(2)} > 0$. Therefore the Markov chain is irreducible.

Also $P_{ii}^{(2)} = P_{ii}^{(4)} = P_{ii}^{(6)} = \ldots > 0$ for all i, so all the states of the chain are periodic with period 2. Since the chain is finite and irreducible, all its states are non-null persistent. Because the states are periodic, they are not ergodic.

Example 2.3.7 A gambler has Rs. 2. He bets Rs. 1 at a time and wins Rs. 1 with probability $\frac{1}{2}$. He stops playing if he loses Rs. 2 or wins Rs. 4.

(a) What is the tpm of the related Markov chain?
(b) What is the probability that he has lost his money at the end of 5 plays?
(c) What is the probability that the game lasts more than 7 plays?

Solution: Let $X_n$ represent the amount with the player at the end of the nth round of play. The state space of {X_n} is (0, 1, 2, 3, 4, 5, 6), as the game ends if the player loses all his money ($X_n = 0$) or wins Rs. 4, i.e. has Rs. 6 ($X_n = 6$).

(a) The tpm of the Markov chain (states in the order 0, 1, \ldots, 6, with 0 and 6 absorbing) is

$P = \begin{pmatrix}
1 & 0 & 0 & 0 & 0 & 0 & 0 \\
\tfrac{1}{2} & 0 & \tfrac{1}{2} & 0 & 0 & 0 & 0 \\
0 & \tfrac{1}{2} & 0 & \tfrac{1}{2} & 0 & 0 & 0 \\
0 & 0 & \tfrac{1}{2} & 0 & \tfrac{1}{2} & 0 & 0 \\
0 & 0 & 0 & \tfrac{1}{2} & 0 & \tfrac{1}{2} & 0 \\
0 & 0 & 0 & 0 & \tfrac{1}{2} & 0 & \tfrac{1}{2} \\
0 & 0 & 0 & 0 & 0 & 0 & 1
\end{pmatrix}.$

Since the player has Rs. 2 initially, the initial probability distribution of {X_n} is
$P^{(0)} = (0,\ 0,\ 1,\ 0,\ 0,\ 0,\ 0).$

Successive state distributions are obtained from $P^{(n)} = P^{(n-1)}P$:

$P^{(1)} = (0,\ \tfrac{1}{2},\ 0,\ \tfrac{1}{2},\ 0,\ 0,\ 0)$
$P^{(2)} = (\tfrac{1}{4},\ 0,\ \tfrac{1}{2},\ 0,\ \tfrac{1}{4},\ 0,\ 0)$
$P^{(3)} = (\tfrac{1}{4},\ \tfrac{1}{4},\ 0,\ \tfrac{3}{8},\ 0,\ \tfrac{1}{8},\ 0)$
$P^{(4)} = (\tfrac{3}{8},\ 0,\ \tfrac{5}{16},\ 0,\ \tfrac{1}{4},\ 0,\ \tfrac{1}{16})$
$P^{(5)} = (\tfrac{3}{8},\ \tfrac{5}{32},\ 0,\ \tfrac{9}{32},\ 0,\ \tfrac{1}{8},\ \tfrac{1}{16})$
$P^{(6)} = (\tfrac{29}{64},\ 0,\ \tfrac{7}{32},\ 0,\ \tfrac{13}{64},\ 0,\ \tfrac{1}{8})$
$P^{(7)} = (\tfrac{29}{64},\ \tfrac{7}{64},\ 0,\ \tfrac{27}{128},\ 0,\ \tfrac{13}{128},\ \tfrac{1}{8}).$

(b) P{the man has lost his money at the end of 5 plays} = $P\{X_5 = 0\}$ = the entry corresponding to state 0 in $P^{(5)}$ = $\tfrac{3}{8}$.

(c) P{the game lasts more than 7 plays} = P{the system is in none of the absorbing states 0 and 6 at the end of the seventh round}
$= P\{X_7 = 1, 2, 3, 4 \text{ or } 5\} = \tfrac{7}{64} + 0 + \tfrac{27}{128} + 0 + \tfrac{13}{128} = \tfrac{14 + 27 + 13}{128} = \tfrac{54}{128} = \tfrac{27}{64}.$
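The repeated multiplications above are easy to automate. The sketch below (illustrative only) builds the gambler's-ruin tpm and iterates the state distribution, reproducing the values used in parts (b) and (c).

```python
# Iterating the state distribution of Example 2.3.7 with the tpm.
import numpy as np

P = np.zeros((7, 7))
P[0, 0] = 1.0                       # absorbing: lost all money
P[6, 6] = 1.0                       # absorbing: won Rs. 4
for i in range(1, 6):               # from states 1..5 move down or up by 1
    P[i, i - 1] = 0.5
    P[i, i + 1] = 0.5

p = np.zeros(7)
p[2] = 1.0                          # starts with Rs. 2
for n in range(1, 8):
    p = p @ P
    if n in (5, 7):
        print(n, p)

print(3 / 8)                        # P(ruined by play 5) = p[0] after 5 plays
print(27 / 64)                      # P(game lasts beyond 7 plays) = p[1:6].sum() after 7 plays
```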

Example 2.3.8 On a given day, a retired professor, Dr. Charles Fish, amuses himself with only one of the following activities: reading (activity 1), gardening (activity 2) or working on his book about a river valley (activity 3). For 1 ≤ i ≤ 3, let $X_n = i$ if Dr. Fish devotes day n to activity i. Suppose that $\{X_n : n = 1, 2, 3, \ldots\}$ is a Markov chain whose choice of activity on the next day depends only on the current day's activity, with tpm
$P = \begin{pmatrix} 0.30 & 0.25 & 0.45 \\ 0.40 & 0.10 & 0.50 \\ 0.25 & 0.40 & 0.35 \end{pmatrix}.$
In the long run, what proportion of days does he devote to each activity?

Solution: Let $\pi_1$, $\pi_2$ and $\pi_3$ be the proportions of days Dr. Fish devotes to reading, gardening and writing respectively. Then $(\pi_1\ \pi_2\ \pi_3)P = (\pi_1\ \pi_2\ \pi_3)$:

$0.30\pi_1 + 0.40\pi_2 + 0.25\pi_3 = \pi_1 \qquad (1)$
$0.25\pi_1 + 0.10\pi_2 + 0.40\pi_3 = \pi_2 \qquad (2)$
$0.45\pi_1 + 0.50\pi_2 + 0.35\pi_3 = \pi_3 \qquad (3)$
$\pi_1 + \pi_2 + \pi_3 = 1 \qquad (4)$

From (4) we have $\pi_3 = 1 - \pi_1 - \pi_2 \qquad (5)$

Substituting $\pi_3$ in (1):
$0.30\pi_1 + 0.40\pi_2 + 0.25(1 - \pi_1 - \pi_2) = \pi_1 \Rightarrow -0.95\pi_1 + 0.15\pi_2 = -0.25 \qquad (6)$

Substituting $\pi_3$ in (2):
$0.25\pi_1 + 0.10\pi_2 + 0.40(1 - \pi_1 - \pi_2) = \pi_2 \Rightarrow 0.15\pi_1 + 1.30\pi_2 = 0.40 \qquad (7)$

Eliminating $\pi_2$ from (6) and (7) (multiply (6) by 26 and (7) by 3, then subtract):
$25.15\pi_1 = 7.70 \Rightarrow \pi_1 \approx 0.306.$

Substituting $\pi_1$ in (6): $0.15\pi_2 = -0.25 + 0.95(0.306) \approx 0.041 \Rightarrow \pi_2 \approx 0.272.$

$\pi_3 = 1 - \pi_1 - \pi_2 \approx 1 - 0.306 - 0.272 = 0.422.$

Therefore Dr. Charles devotes approximately 31 percent of his days to reading, 27 percent to gardening and 42 percent to writing.

Example 2.3.9 The tpm of a Markov chain with three states 0, 1, 2 is
$P = \begin{pmatrix} \tfrac{3}{4} & \tfrac{1}{4} & 0 \\ \tfrac{1}{4} & \tfrac{1}{2} & \tfrac{1}{4} \\ 0 & \tfrac{3}{4} & \tfrac{1}{4} \end{pmatrix}$
and the initial state distribution of the chain is $P[X_0 = i] = \frac{1}{3}$, i = 0, 1, 2.
Find (i) $P[X_2 = 2]$, (ii) $P[X_3 = 1, X_2 = 2, X_1 = 1, X_0 = 2]$, (iii) $P[X_2 = 1, X_0 = 0]$.

Solution: Given P as above,
$P^{(2)} = P^2 = \begin{pmatrix} \tfrac{5}{8} & \tfrac{5}{16} & \tfrac{1}{16} \\ \tfrac{5}{16} & \tfrac{8}{16} & \tfrac{3}{16} \\ \tfrac{3}{16} & \tfrac{9}{16} & \tfrac{4}{16} \end{pmatrix}.$

(i) From the definition of conditional probability,
$P[X_2 = 2] = \sum_{i=0}^{2} P[X_2 = 2 / X_0 = i]\, P[X_0 = i] = P_{02}^{(2)} P[X_0 = 0] + P_{12}^{(2)} P[X_0 = 1] + P_{22}^{(2)} P[X_0 = 2]$
$= \dfrac{1}{3}\left[\dfrac{1}{16} + \dfrac{3}{16} + \dfrac{4}{16}\right] = \dfrac{1}{6}.$

(ii) $P[X_3 = 1, X_2 = 2, X_1 = 1, X_0 = 2] = P[X_3 = 1 / X_2 = 2]\, P[X_2 = 2 / X_1 = 1]\, P[X_1 = 1 / X_0 = 2]\, P[X_0 = 2]$
$= P_{21}\, P_{12}\, P_{21}\, P[X_0 = 2] = \dfrac{3}{4}\cdot\dfrac{1}{4}\cdot\dfrac{3}{4}\cdot\dfrac{1}{3} = \dfrac{3}{64}.$

(iii) From $P^2$, $P[X_2 = 1 / X_0 = 0] = P_{01}^{(2)} = \dfrac{5}{16}$, so
$P[X_2 = 1, X_0 = 0] = P[X_2 = 1 / X_0 = 0]\, P[X_0 = 0] = \dfrac{5}{16}\cdot\dfrac{1}{3} = \dfrac{5}{48}.$

Example 2.3.10 There are 2 white balls in bag A and 3 red balls in bag B. At each step of the process, a ball is selected from each bag and the 2 balls selected are interchanged. Let the state $a_i$ of the system be the number of red balls in A after i interchanges. What is the probability that there are 2 red balls in A after 3 steps? In the long run, what is the probability that there are 2 red balls in bag A?

Solution: The state space of the chain {X_n} is (0, 1, 2), since the number of balls in bag A is always 2. Let the transition probability matrix of the chain be $P = (P_{ij})$, i, j = 0, 1, 2.

$P_{00} = 0$, since after an interchange from state 0 bag A must contain a red ball. $P_{02} = P_{20} = 0$, since after one interchange the number of red balls in bag A cannot increase or decrease by 2.

Let $X_n = 1$, i.e. A contains 1 red ball (and 1 white ball) and B contains 1 white and 2 red balls. Then
$P\{X_{n+1} = 0 / X_n = 1\} = P_{10} = \dfrac{1}{2}\cdot\dfrac{1}{3} = \dfrac{1}{6}, \qquad P_{12} = \dfrac{1}{2}\cdot\dfrac{2}{3} = \dfrac{1}{3}.$

Since P is a stochastic matrix, $P_{10} + P_{11} + P_{12} = 1$, so $P_{11} = \dfrac{1}{2}$. Similarly $P_{21} = \dfrac{2}{3}$ and $P_{22} = 1 - (P_{20} + P_{21}) = \dfrac{1}{3}$.

Therefore $P = \begin{pmatrix} 0 & 1 & 0 \\ \tfrac{1}{6} & \tfrac{1}{2} & \tfrac{1}{3} \\ 0 & \tfrac{2}{3} & \tfrac{1}{3} \end{pmatrix}.$

Now $P^{(0)} = (1, 0, 0)$, as there is no red ball in A in the beginning.

$P^{(1)} = P^{(0)}P = (0, 1, 0)$
$P^{(2)} = P^{(1)}P = (\tfrac{1}{6}, \tfrac{1}{2}, \tfrac{1}{3})$
$P^{(3)} = P^{(2)}P = (\tfrac{1}{12}, \tfrac{23}{36}, \tfrac{5}{18})$

P{there are 2 red balls in bag A after 3 steps} = $P\{X_3 = 2\} = P_2^{(3)} = \dfrac{5}{18}$.

Let the stationary probability distribution of the chain be $\pi = (\pi_0, \pi_1, \pi_2)$. By the property of $\pi$, we have $\pi P = \pi$ and $\pi_0 + \pi_1 + \pi_2 = 1$:

$\tfrac{1}{6}\pi_1 = \pi_0, \qquad \pi_0 + \tfrac{1}{2}\pi_1 + \tfrac{2}{3}\pi_2 = \pi_1, \qquad \tfrac{1}{3}\pi_1 + \tfrac{1}{3}\pi_2 = \pi_2.$

From the third equation, $\pi_2 = \tfrac{1}{2}\pi_1$; from the first, $\pi_1 = 6\pi_0$, so $\pi_2 = 3\pi_0$. Then $\pi_0 + 6\pi_0 + 3\pi_0 = 1$ gives $\pi_0 = \tfrac{1}{10}$, $\pi_1 = \tfrac{6}{10}$, $\pi_2 = \tfrac{3}{10}$.

Therefore the steady state probability distribution is $\pi = (\tfrac{1}{10}, \tfrac{6}{10}, \tfrac{3}{10})$. Thus, in the long run, 30 percent of the time there will be two red balls in bag A.
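The long-run fractions can also be approximated by simulating the chain for many steps, as in the following illustrative sketch (not part of the original text).

```python
# Long-run occupation fractions of the ball-exchange chain of Example 2.3.10.
import numpy as np

P = np.array([[0.0, 1.0, 0.0],
              [1/6, 1/2, 1/3],
              [0.0, 2/3, 1/3]])

rng = np.random.default_rng(4)
state, visits = 0, np.zeros(3)
for _ in range(200_000):
    state = rng.choice(3, p=P[state])   # one transition of the chain
    visits[state] += 1

print(visits / visits.sum())            # approximately [0.1, 0.6, 0.3]
```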

2.4

Exercise

Two marks questions

1. Define Markov chain and one-step transition probability.
2. What is meant by the steady-state distribution of a Markov chain?
3. Define Markov process and give an example.
4. What is a stochastic matrix? When is it said to be regular?
5. Define irreducible Markov chain and state the Chapman-Kolmogorov theorem.
6. Find the invariant probabilities for the Markov chain $\{X_n;\ n \ge 1\}$ with state space S = {0, 1} and one-step tpm $P = \begin{pmatrix} 1 & 0 \\ \tfrac{1}{2} & \tfrac{1}{2} \end{pmatrix}$.

7. At an intersection, a working traffic light will be out of order the next day
with probability 0.07, and out of order traffic light will be working the next day
with probability 0.88. Let Xn = 1 if a day n the traffic will work; Xn =0 if on
day n the traffic light will not work. Is {Xn ; n = 0, 1, 2 . . .} a Markov chain?. If
so, write the transition probability matrix.
8. The tpm of a Markov chain with three states 0, 1, 2 is
$P = \begin{pmatrix} \tfrac{3}{4} & \tfrac{1}{4} & 0 \\ \tfrac{1}{4} & \tfrac{1}{2} & \tfrac{1}{4} \\ 0 & \tfrac{3}{4} & \tfrac{1}{4} \end{pmatrix}$
and the initial state distribution of the chain is $P[X_0 = i] = \frac{1}{3}$, i = 0, 1, 2. Find $P[X_3 = 1, X_2 = 2, X_1 = 1, X_0 = 2]$.


9. Define recurrent state, absorbing state and transient state of a Markov chain.
10. Define regular and ergodic Markov chains.


Choose the Correct Answer


(1). All regular Markov chains are
(a)ergodic (b)Stationary (c)WSS (d)None
Ans:(a)
(2). If a Markov chain is finite irreducible, all its states are
(a)Transient (b)null persistent (c)non-null persistent (d)return state.
Ans:(c)
(3). If di =1,then state i is said to be
(a)periodic (b)return state (c)recurrent (d)aperiodic
Ans:(d)
(4). A non null persistent and aperiodic state is
(a)regular (b)ergodic (c)transient (d)mean recurrence time
Ans:(b)
(5). The sum of the elements of any row in the transition probability matrix of
a finite state Markov chain is
(a) 0 (b) 1 (c) 2 (d) 3
Ans:(b)


Chapter 3

Poisson Process

3.1

Basic Definitions

Definition 3.1.1 If X(t) represents the number of occurrences of a certain event in (0, t), then the discrete random process {X(t)} is called a Poisson process, provided the following postulates are satisfied:

(i) $P[1 \text{ occurrence in } (t, t + \Delta t)] = \lambda\,\Delta t$
(ii) $P[0 \text{ occurrences in } (t, t + \Delta t)] = 1 - \lambda\,\Delta t$
(iii) $P[2 \text{ or more occurrences in } (t, t + \Delta t)] = 0$
(iv) X(t) is independent of the number of occurrences of the event in any interval prior to and after the interval (0, t)
(v) The probability that the event occurs a specified number of times in $(t_0, t_0 + t)$ depends only on t, but not on $t_0$.

Example 3.1.2 (i) The arrival of customers at a bank.
(ii) The occurrence of lightning strikes within some prescribed area.
(iii) The failure of a component in a system.
(iv) The emission of electrons from the surface of a light-sensitive material.

Probability law for the Poisson Process {X(t)}

Let $\lambda$ be the number of occurrences of the event per unit time, and let $P_n(t) = P[X(t) = n]$. Then

$P_n(t + \Delta t) = P[X(t + \Delta t) = n]$
$= P[(n-1) \text{ calls in } (0, t) \text{ and } 1 \text{ call in } (t, t + \Delta t)] + P[n \text{ calls in } (0, t) \text{ and no call in } (t, t + \Delta t)].$

Therefore
$P_n(t + \Delta t) = P_{n-1}(t)\,\lambda\Delta t + P_n(t)(1 - \lambda\Delta t),$
$\dfrac{P_n(t + \Delta t) - P_n(t)}{\Delta t} = \lambda\,[P_{n-1}(t) - P_n(t)].$

Taking the limit as $\Delta t \to 0$,

$P_n'(t) = \lambda\,[P_{n-1}(t) - P_n(t)] \qquad (1)$

Let the solution of equation (1) be $P_n(t) = \dfrac{(\lambda t)^n}{n!} f(t) \qquad (2)$

Differentiating (2) with respect to t,

$P_n'(t) = \dfrac{\lambda^n}{n!}\big[n t^{n-1} f(t) + t^n f'(t)\big] \qquad (3)$

Using (2) and (3) in (1), we get

$\dfrac{\lambda^n}{n!}\big[n t^{n-1} f(t) + t^n f'(t)\big] = \lambda\,\dfrac{(\lambda t)^{n-1}}{(n-1)!} f(t) - \lambda\,\dfrac{(\lambda t)^n}{n!} f(t).$

Therefore $\dfrac{(\lambda t)^n}{n!} f'(t) = -\lambda\,\dfrac{(\lambda t)^n}{n!} f(t)$, i.e. $f'(t) = -\lambda f(t)$, so $\dfrac{f'(t)}{f(t)} = -\lambda$.

Integrating, $\log f(t) = -\lambda t + \log k$, so $f(t) = k e^{-\lambda t}$.

From (2), $P_0(t) = f(t)$, i.e. $f(0) = P_0(0) = P[X(0) = 0] = P[\text{no event occurs in } (0, 0)] = 1$. But f(0) = k, therefore k = 1.

Hence $f(t) = e^{-\lambda t}$, and

$P[X(t) = n] = P_n(t) = \dfrac{e^{-\lambda t}(\lambda t)^n}{n!}, \qquad n = 0, 1, 2, \ldots$

This is the probability law for the Poisson process. It is to be observed that the probability distribution of X(t) is the Poisson distribution with parameter $\lambda t$.
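This law can be verified by simulation: generating a Poisson process by summing exponential inter-arrival times and counting how many fall in (0, t] produces counts whose empirical distribution matches $e^{-\lambda t}(\lambda t)^n/n!$. The sketch below is an illustration with arbitrary parameter values, not part of the original text.

```python
# Empirical check of the Poisson probability law by simulating arrival times.
import math
import numpy as np

rng = np.random.default_rng(3)
lam, t, n_paths = 2.0, 1.5, 50_000

counts = np.empty(n_paths, dtype=int)
for k in range(n_paths):
    total, n = 0.0, 0
    while True:
        total += rng.exponential(1.0 / lam)   # inter-arrival time ~ Exp(lam)
        if total > t:
            break
        n += 1
    counts[k] = n                             # number of arrivals in (0, t]

for n in range(5):
    empirical = (counts == n).mean()
    exact = math.exp(-lam * t) * (lam * t) ** n / math.factorial(n)
    print(n, round(empirical, 4), round(exact, 4))
```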

Mean of the Poisson process:

$E[X(t)] = \sum_{n=0}^{\infty} n\,P_n(t) = \sum_{n=0}^{\infty} n\,\dfrac{(\lambda t)^n}{n!} e^{-\lambda t} = e^{-\lambda t}\sum_{n=1}^{\infty} \dfrac{(\lambda t)^n}{(n-1)!}$
$= \lambda t\, e^{-\lambda t}\left[1 + \dfrac{\lambda t}{1!} + \dfrac{\lambda^2 t^2}{2!} + \ldots\right] = \lambda t\, e^{-\lambda t} e^{\lambda t} = \lambda t.$

Variance of the Poisson process:

$Var[X(t)] = E[X^2(t)] - \{E[X(t)]\}^2 \qquad (1)$

$E[X^2(t)] = \sum_{n=0}^{\infty} n^2 P_n(t) = \sum_{n=0}^{\infty} n^2\,\dfrac{e^{-\lambda t}(\lambda t)^n}{n!}.$

Now $n^2 = n(n-1) + n$, hence

$E[X^2(t)] = \sum_{n=0}^{\infty} \dfrac{n(n-1)\, e^{-\lambda t}(\lambda t)^n}{n!} + \sum_{n=0}^{\infty} \dfrac{n\, e^{-\lambda t}(\lambda t)^n}{n!}$
$= e^{-\lambda t}(\lambda t)^2\sum_{n=2}^{\infty} \dfrac{(\lambda t)^{n-2}}{(n-2)!} + \lambda t = e^{-\lambda t}\lambda^2 t^2 e^{\lambda t} + \lambda t = \lambda^2 t^2 + \lambda t.$

Hence $Var[X(t)] = \lambda^2 t^2 + \lambda t - (\lambda t)^2 = \lambda t.$

Auto-correlation of the Poisson process:

$R_{XX}(t_1, t_2) = E[X(t_1)X(t_2)] = E[X(t_1)\{X(t_2) - X(t_1) + X(t_1)\}]$
$= E[X(t_1)]\,E[X(t_2) - X(t_1)] + E[X^2(t_1)],$
since {X(t)} is a process of independent increments. Thus, for $t_2 \ge t_1$,
$R_{XX}(t_1, t_2) = \lambda t_1\,\lambda(t_2 - t_1) + \lambda t_1 + \lambda^2 t_1^2 = \lambda^2 t_1 t_2 + \lambda t_1.$
In general, $R_{XX}(t_1, t_2) = \lambda^2 t_1 t_2 + \lambda\min(t_1, t_2)$.

Auto-covariance of the Poisson process:

$C(t_1, t_2) = R(t_1, t_2) - E[X(t_1)]\,E[X(t_2)] = \lambda^2 t_1 t_2 + \lambda t_1 - \lambda^2 t_1 t_2 = \lambda t_1, \quad \text{if } t_2 \ge t_1.$
Therefore $C(t_1, t_2) = \lambda\min\{t_1, t_2\}$.

3.2

Properties of the Poisson Process

Property 1: The Poisson process is a Markov process.

Proof:
$P[X(t_3) = n_3 / X(t_2) = n_2, X(t_1) = n_1] = \dfrac{P[X(t_1) = n_1, X(t_2) = n_2, X(t_3) = n_3]}{P[X(t_1) = n_1, X(t_2) = n_2]}$
$= \dfrac{e^{-\lambda(t_3 - t_2)}\,\lambda^{n_3 - n_2}\,(t_3 - t_2)^{n_3 - n_2}}{(n_3 - n_2)!} = P[X(t_3) = n_3 / X(t_2) = n_2].$

This means that the conditional probability distribution of $X(t_3)$, given all the past values $X(t_1) = n_1$, $X(t_2) = n_2$, depends only on the most recent value $X(t_2) = n_2$. Therefore the Poisson process possesses the Markov property; hence the Poisson process is a Markov process.

Property 2: The sum of two independent Poisson processes is a Poisson process.

Proof: Let $X(t) = X_1(t) + X_2(t)$, where $X_1(t)$ and $X_2(t)$ are independent Poisson processes with parameters $\lambda_1$ and $\lambda_2$.
$P[X(t) = n] = \sum_{k=0}^{n} P[X_1(t) = k]\, P[X_2(t) = n - k] = \sum_{k=0}^{n} \dfrac{e^{-\lambda_1 t}(\lambda_1 t)^k}{k!}\cdot\dfrac{e^{-\lambda_2 t}(\lambda_2 t)^{n-k}}{(n-k)!}$
$= \dfrac{e^{-(\lambda_1 + \lambda_2)t}}{n!}\sum_{k=0}^{n} \dfrac{n!}{k!(n-k)!}(\lambda_1 t)^k(\lambda_2 t)^{n-k} = \dfrac{e^{-(\lambda_1 + \lambda_2)t}}{n!}\big(\lambda_1 t + \lambda_2 t\big)^n, \quad n = 0, 1, 2, \ldots$

Therefore $X(t) = X_1(t) + X_2(t)$ is a Poisson process with parameter $(\lambda_1 + \lambda_2)t$. Hence the sum of two independent Poisson processes is also a Poisson process.

Property 3: The difference of two independent Poisson processes is not a Poisson process.

Proof: Let $X(t) = X_1(t) - X_2(t)$. Then
$E[X(t)] = \lambda_1 t - \lambda_2 t = (\lambda_1 - \lambda_2)t,$
$E[X^2(t)] = E[X_1^2(t)] - 2E[X_1(t)X_2(t)] + E[X_2^2(t)]$
$= \lambda_1^2 t^2 + \lambda_1 t + \lambda_2^2 t^2 + \lambda_2 t - 2\lambda_1\lambda_2 t^2$, since $X_1(t)$ and $X_2(t)$ are independent,
$= (\lambda_1 + \lambda_2)t + [(\lambda_1 - \lambda_2)t]^2. \qquad (1)$

We know that $E[X^2(t)]$ for a Poisson process with parameter $\lambda t$ is $\lambda t + \lambda^2 t^2$; if $X_1(t) - X_2(t)$ were a Poisson process, its second moment would be $(\lambda_1 - \lambda_2)t + (\lambda_1 - \lambda_2)^2 t^2$, which does not agree with (1). Hence $X_1(t) - X_2(t)$ is not a Poisson process.

Property 4: The inter-arrival time of a Poisson process, i.e. the interval between two successive occurrences of a Poisson process with parameter $\lambda$, has an exponential distribution with mean $\frac{1}{\lambda}$, i.e. with parameter $\lambda$.

Proof:
Let two consecutive occurrences of the event be $E_i$ and $E_{i+1}$. Let $E_i$ take place at time instant $t_i$ and let T be the interval between the occurrences of $E_i$ and $E_{i+1}$. T is a continuous random variable.

$P[T > t] = P\{E_{i+1} \text{ did not occur in } (t_i, t_i + t)\} = P[\text{no event occurs in an interval of length } t] = P[X(t) = 0] = e^{-\lambda t}.$

Therefore the cumulative distribution function of T is given by
$F(t) = P[T \le t] = 1 - P[T > t] = 1 - e^{-\lambda t},$
and the p.d.f. of T is $f(t) = \lambda e^{-\lambda t}$ $(t \ge 0)$, the exponential distribution with parameter $\lambda$, i.e. T has an exponential distribution with mean $\frac{1}{\lambda}$.

Property 5: If the number of occurrences of an event E in an interval of length t is a Poisson process {X(t)} with parameter $\lambda$, and if each occurrence of E has a constant probability p of being recorded, the recordings being independent of each other, then the number N(t) of recorded occurrences in t is also a Poisson process with parameter $\lambda p$.

Hence $P[N(t) = n] = \dfrac{e^{-\lambda p t}(\lambda p t)^n}{n!}, \quad n = 0, 1, 2, \ldots$

3.3

Example

Example 3.3.1 If $\{N_1(t)\}$ and $\{N_2(t)\}$ are two independent Poisson processes with parameters $\lambda_1$ and $\lambda_2$ respectively, show that
$P[N_1(t) = k / N_1(t) + N_2(t) = n] = \binom{n}{k} p^k q^{n-k}$, where $p = \dfrac{\lambda_1}{\lambda_1 + \lambda_2}$ and $q = \dfrac{\lambda_2}{\lambda_1 + \lambda_2}$.

Solution: By definition, $P[N_1(t) = r] = \dfrac{e^{-\lambda_1 t}(\lambda_1 t)^r}{r!}$ and $P[N_2(t) = r] = \dfrac{e^{-\lambda_2 t}(\lambda_2 t)^r}{r!}$, r = 0, 1, 2, \ldots

$P[N_1(t) = k / N_1(t) + N_2(t) = n] = \dfrac{P[N_1(t) = k,\ N_1(t) + N_2(t) = n]}{P[N_1(t) + N_2(t) = n]} = \dfrac{P[N_1(t) = k,\ N_2(t) = n - k]}{P[N_1(t) + N_2(t) = n]}$
$= \dfrac{P[N_1(t) = k]\, P[N_2(t) = n - k]}{P[N_1(t) + N_2(t) = n]} = \dfrac{n!\, e^{-\lambda_1 t}(\lambda_1 t)^k\, e^{-\lambda_2 t}(\lambda_2 t)^{n-k}}{k!(n-k)!\, e^{-(\lambda_1 + \lambda_2)t}\,[(\lambda_1 + \lambda_2)t]^n}$
$= \binom{n}{k}\dfrac{\lambda_1^k \lambda_2^{n-k}}{(\lambda_1 + \lambda_2)^n} = \binom{n}{k}\left(\dfrac{\lambda_1}{\lambda_1 + \lambda_2}\right)^k\left(\dfrac{\lambda_2}{\lambda_1 + \lambda_2}\right)^{n-k}.$

Taking $p = \dfrac{\lambda_1}{\lambda_1 + \lambda_2}$ and $q = 1 - p = \dfrac{\lambda_2}{\lambda_1 + \lambda_2}$, we have

$P[N_1(t) = k / N_1(t) + N_2(t) = n] = \binom{n}{k} p^k q^{n-k}.$

Example 3.3.2 A radioactive source emits particles at a rate of 6 per minute in accordance with a Poisson process. Each particle emitted has a probability $\frac{1}{3}$ of being recorded. Find the probability that at least 5 particles are recorded in a 5-minute period.

Solution: Let N(t) be the number of recorded particles. Then {N(t)} is a Poisson process with $\lambda p$ as parameter. Now $\lambda p = 6\cdot\frac{1}{3} = 2$, so
$P[N(t) = n] = \dfrac{e^{-2t}(2t)^n}{n!}, \quad n = 0, 1, 2, \ldots$

Therefore P[at least 5 particles are recorded in a 5-minute period] $= P[N(5) \ge 5]$
$= 1 - P[N(5) < 5] = 1 - \{P[N(5) = 0] + P[N(5) = 1] + P[N(5) = 2] + P[N(5) = 3] + P[N(5) = 4]\}$
$= 1 - e^{-10}\left[1 + 10 + \dfrac{10^2}{2!} + \dfrac{10^3}{3!} + \dfrac{10^4}{4!}\right] = 0.9707.$
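A short numerical check of this value (illustrative only): the recorded count in 5 minutes is Poisson with mean $\lambda p t = 2 \times 5 = 10$, so the answer is one minus the first five terms of that Poisson distribution.

```python
# Recomputing Example 3.3.2: P(N(5) >= 5) for a Poisson count with mean 10.
import math

mean = 2 * 5
p_less_than_5 = sum(math.exp(-mean) * mean**n / math.factorial(n) for n in range(5))
print(1 - p_less_than_5)   # approximately 0.9707
```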

Example 3.3.3 If customers arrive at a counter in accordance with a Poisson process with a rate of 3 per minute, find the probability that the interval between 2 consecutive arrivals is

1. more than 1 minute,
2. between 1 minute and 2 minutes,
3. 4 minutes or less.

Solution: By Property 4 of the Poisson process, the interval T between 2 consecutive arrivals follows an exponential distribution with parameter $\lambda = 3$, whose p.d.f. is $f(t) = 3e^{-3t}$, t ≥ 0.

1. $P(T > 1) = \int_1^{\infty} 3e^{-3t}\, dt = \big[-e^{-3t}\big]_1^{\infty} = e^{-3}.$

2. $P(1 < T < 2) = \int_1^{2} 3e^{-3t}\, dt = \big[-e^{-3t}\big]_1^{2} = e^{-3} - e^{-6}.$

3. $P(T \le 4) = \int_0^{4} 3e^{-3t}\, dt = \big[-e^{-3t}\big]_0^{4} = 1 - e^{-12}.$

Example 3.3.4 A machine goes out of order whenever a component fails. The failure of this part follows a Poisson process with a mean rate of 1 per week. Find the probability that 2 weeks have elapsed since the last failure. If there are 5 spare parts of this component in inventory and the next supply is not due for 10 weeks, find the probability that the machine will not be out of order in the next 10 weeks.

Solution: Here the unit of time t is 1 week. The mean failure rate is the mean number of failures per week, i.e. $\lambda = 1$.

1. P[no failures in the 2 weeks since the last failure] $= P[X(2) = 0] = \dfrac{e^{-2}\,2^0}{0!} = e^{-2} = 0.135.$

2. There are only 5 spare parts and the machine should not go out of order in the next 10 weeks, so at most 5 failures may occur in that time.
$P[X(10) \le 5] = \sum_{n=0}^{5} \dfrac{e^{-10}(10)^n}{n!} = e^{-10}\left[1 + 10 + \dfrac{10^2}{2!} + \dfrac{10^3}{3!} + \dfrac{10^4}{4!} + \dfrac{10^5}{5!}\right] \approx 0.067.$

Example 3.3.5 What is the superposition of n independent Poisson processes with respective average rates $\lambda_1, \lambda_2, \ldots, \lambda_n$?

Solution: The superposition of n independent Poisson processes with average rates $\lambda_1, \lambda_2, \ldots, \lambda_n$ is another Poisson process with average rate $\lambda_1 + \lambda_2 + \ldots + \lambda_n$.

Example 3.3.6 If {X(t)} is a Poisson process, prove that
$P[X(s) = r / X(t) = n] = \binom{n}{r}\left(\dfrac{s}{t}\right)^r\left(1 - \dfrac{s}{t}\right)^{n-r}$, where s < t.

Solution:
$P[X(s) = r / X(t) = n] = \dfrac{P[X(s) = r,\ X(t) = n]}{P[X(t) = n]} = \dfrac{P[X(s) = r,\ X(t - s) = n - r]}{P[X(t) = n]} = \dfrac{P[X(s) = r]\, P[X(t - s) = n - r]}{P[X(t) = n]},$
since {X(t)} is a process of independent increments.

Therefore
$P[X(s) = r / X(t) = n] = \left\{\dfrac{e^{-\lambda s}(\lambda s)^r}{r!}\right\}\left\{\dfrac{e^{-\lambda(t-s)}[\lambda(t-s)]^{n-r}}{(n-r)!}\right\}\dfrac{n!}{e^{-\lambda t}(\lambda t)^n}$
$= \dfrac{n!}{r!(n-r)!}\cdot\dfrac{s^r(t-s)^{n-r}}{t^n} = \dfrac{n!}{r!(n-r)!}\left(\dfrac{s}{t}\right)^r\left(\dfrac{t-s}{t}\right)^{n-r}.$

Hence $P[X(s) = r / X(t) = n] = \binom{n}{r}\left(\dfrac{s}{t}\right)^r\left(1 - \dfrac{s}{t}\right)^{n-r}.$

Example 3.3.7 A fisherman catches fish at a Poisson rate of 2 per hour from a large lake with lots of fish. If he starts fishing at 10:00 a.m., what is the probability that he catches one fish by 10:30 a.m. and 3 fish by noon?

Solution: Let N(t) be the total number of fish caught at or before time t (in hours after 10:00 a.m.). Then N(0) = 0 and {N(t) : t ≥ 0} has stationary and independent increments, so {N(t) : t ≥ 0} is a Poisson process with $\lambda = 2$:
$P[N(t) = n] = \dfrac{e^{-2t}(2t)^n}{n!}, \quad n = 0, 1, 2, \ldots$

The required probability is that of 1 catch in the first half hour and 2 more in the following $\tfrac{3}{2}$ hours:
$P[N(\tfrac{1}{2}) = 1 \text{ and } N(2) = 3] = P[N(\tfrac{1}{2}) = 1]\, P[N(\tfrac{3}{2}) = 2] = \left\{\dfrac{e^{-1}\cdot 1}{1!}\right\}\left\{\dfrac{e^{-3}\cdot 3^2}{2!}\right\} = 4.5\, e^{-4} \approx 0.082.$

3.4

Birth and Death Process:

We now turn to a random process that has wide applications in several fields of natural phenomena, such as queueing problems, telephone exchanges, traffic maintenance, epidemics and population growth. This is the birth and death process.

Definition 3.4.1 If X(t) represents the number of individuals present at time t in a population [or the size of the population at time t] in which two types of events occur, namely births, which contribute to its increase, and deaths, which contribute to its decrease, then the discrete random process {X(t)} is called the birth and death process, provided the births and deaths are governed by the following postulates:

(i) $P[1 \text{ birth in } (t, t + \Delta t)] = \lambda_n(t)\,\Delta t + o(\Delta t)$
(ii) $P[0 \text{ births in } (t, t + \Delta t)] = 1 - \lambda_n(t)\,\Delta t + o(\Delta t)$
(iii) $P[2 \text{ or more births in } (t, t + \Delta t)] = o(\Delta t)$
(iv) Births occurring in $(t, t + \Delta t)$ are independent of the time since the last birth.
(v) $P[1 \text{ death in } (t, t + \Delta t)] = \mu_n(t)\,\Delta t + o(\Delta t)$
(vi) $P[0 \text{ deaths in } (t, t + \Delta t)] = 1 - \mu_n(t)\,\Delta t + o(\Delta t)$
(vii) $P[2 \text{ or more deaths in } (t, t + \Delta t)] = o(\Delta t)$
(viii) Deaths occurring in $(t, t + \Delta t)$ are independent of the time since the last death.
(ix) Births and deaths occur independently of each other at any time.
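These postulates translate directly into a simulation: the process stays in state n for an exponentially distributed time (rate $\lambda_n + \mu_n$) and then jumps up or down. The following sketch (an illustration only, with constant rates $\lambda_n = \lambda$ and $\mu_n = \mu$ chosen arbitrarily) records the long-run fraction of time spent in each state.

```python
# Simulating a birth-death process with constant birth/death rates.
import numpy as np

rng = np.random.default_rng(5)
lam, mu = 1.0, 1.5                 # assumed constant birth and death rates
t, t_end, n = 0.0, 10_000.0, 0
time_in_state = {}

while t < t_end:
    rate = lam + (mu if n > 0 else 0.0)       # total event rate in state n
    dwell = rng.exponential(1.0 / rate)       # exponential holding time
    time_in_state[n] = time_in_state.get(n, 0.0) + dwell
    t += dwell
    if n == 0 or rng.random() < lam / rate:
        n += 1                                # birth
    else:
        n -= 1                                # death

total = sum(time_in_state.values())
print([round(time_in_state.get(k, 0.0) / total, 3) for k in range(5)])
print([round((1 - lam / mu) * (lam / mu) ** k, 3) for k in range(5)])  # long-run values for these rates
```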

Probability Distribution of X(t)

Let $P_n(t) = P[X(t) = n]$ be the probability that the size of the population is n at time t. Then $P_n(t + \Delta t) = P\{X(t + \Delta t) = n\}$ is the probability that the size of the population is n at time $t + \Delta t$. Now the event $X(t + \Delta t) = n$ can happen in any one of the following four mutually exclusive ways:

(i) X(t) = n and no birth or death in $(t, t + \Delta t)$
(ii) X(t) = n - 1 and 1 birth and no death in $(t, t + \Delta t)$
(iii) X(t) = n + 1 and no birth and 1 death in $(t, t + \Delta t)$
(iv) X(t) = n and 1 birth and 1 death in $(t, t + \Delta t)$

Therefore
$P_n(t + \Delta t) = P_n(t)(1 - \lambda_n\Delta t)(1 - \mu_n\Delta t) + P_{n-1}(t)\,\lambda_{n-1}\Delta t\,(1 - \mu_{n-1}\Delta t) + P_{n+1}(t)(1 - \lambda_{n+1}\Delta t)\,\mu_{n+1}\Delta t + P_n(t)\,\lambda_n\Delta t\,\mu_n\Delta t.$

Omitting terms containing $(\Delta t)^2$,
$P_n(t + \Delta t) = P_n(t) - (\lambda_n + \mu_n)P_n(t)\Delta t + \lambda_{n-1}P_{n-1}(t)\Delta t + \mu_{n+1}P_{n+1}(t)\Delta t,$
$\dfrac{P_n(t + \Delta t) - P_n(t)}{\Delta t} = \lambda_{n-1}P_{n-1}(t) - (\lambda_n + \mu_n)P_n(t) + \mu_{n+1}P_{n+1}(t) \qquad (1)$

Taking limits on both sides of (1) as $\Delta t \to 0$, we get

$P_n'(t) = \lambda_{n-1}P_{n-1}(t) - (\lambda_n + \mu_n)P_n(t) + \mu_{n+1}P_{n+1}(t) \qquad (2)$

Expression (2) holds for n ≥ 1. It is not valid when n = 0, as no death is possible in $(t, t + \Delta t)$ and X(t) = n - 1 = -1 is not possible. For n = 0,

$P_0(t + \Delta t) = P_0(t)(1 - \lambda_0\Delta t) + P_1(t)(1 - \lambda_1\Delta t)\,\mu_1\Delta t$
$P_0(t + \Delta t) - P_0(t) = -P_0(t)\,\lambda_0\Delta t + P_1(t)\,\mu_1\Delta t - \lambda_1\mu_1 P_1(t)(\Delta t)^2.$

Omitting terms containing $(\Delta t)^2$ and dividing by $\Delta t$,
$\dfrac{P_0(t + \Delta t) - P_0(t)}{\Delta t} = -\lambda_0 P_0(t) + \mu_1 P_1(t) \qquad (3)$

Taking limits as $\Delta t \to 0$ in (3), we get

$P_0'(t) = -\lambda_0 P_0(t) + \mu_1 P_1(t) \qquad (4)$

On solving equations (2) and (4), we get $P_n(t) = P[X(t) = n]$, n ≥ 0, the probability distribution of X(t).

Steady State Solution:

The steady state solution of this process can be obtained by setting $\dfrac{dP_n(t)}{dt} = 0$. Then the above differential-difference equations (2) and (4) reduce to

$(\lambda_n + \mu_n)P_n = \lambda_{n-1}P_{n-1} + \mu_{n+1}P_{n+1}, \quad n \ge 1 \qquad (5)$
and
$\lambda_0 P_0 = \mu_1 P_1 \qquad (6)$

where $P_n$ denotes the steady state value of $P_n(t)$. Equations (5) and (6) are known as the balance equations.

By re-arranging equation (5), we get
$\lambda_n P_n - \mu_{n+1}P_{n+1} = \lambda_{n-1}P_{n-1} - \mu_n P_n = \ldots = \lambda_0 P_0 - \mu_1 P_1.$

But from (6), $\lambda_0 P_0 = \mu_1 P_1$, so $P_1 = \dfrac{\lambda_0}{\mu_1}P_0$. It follows that $\lambda_{n-1}P_{n-1} = \mu_n P_n$ and hence

$P_n = \dfrac{\lambda_{n-1}}{\mu_n}P_{n-1} \qquad (7)$

This is true for n ≥ 1. From (7), we have

$P_2 = \dfrac{\lambda_1}{\mu_2}P_1 = \dfrac{\lambda_0\lambda_1}{\mu_1\mu_2}P_0, \qquad P_3 = \dfrac{\lambda_2}{\mu_3}P_2 = \dfrac{\lambda_0\lambda_1\lambda_2}{\mu_1\mu_2\mu_3}P_0.$

Proceeding in this way,

$P_n = \dfrac{\lambda_0\lambda_1\lambda_2\ldots\lambda_{n-1}}{\mu_1\mu_2\mu_3\ldots\mu_n}P_0 = P_0\prod_{i=0}^{n-1}\dfrac{\lambda_i}{\mu_{i+1}}, \quad n \ge 1 \qquad (8)$

Since $\sum_{n\ge 0} P_n = 1$, we have $P_0\left[1 + \sum_{n=1}^{\infty}\prod_{i=0}^{n-1}\dfrac{\lambda_i}{\mu_{i+1}}\right] = 1$, so

$P_0 = \dfrac{1}{1 + \sum_{n=1}^{\infty}\prod_{i=0}^{n-1}\dfrac{\lambda_i}{\mu_{i+1}}} \qquad (9)$

Therefore the steady state solutions of the birth-death process are given by
$P_n = P_0\prod_{i=0}^{n-1}\dfrac{\lambda_i}{\mu_{i+1}}, \quad n \ge 1,$ together with (9) for $P_0$.

Thus the limiting distribution $(P_0, P_1, \ldots)$ is completely determined, and the probabilities are non-zero provided the series $\sum_{n=1}^{\infty}\prod_{i=0}^{n-1}\dfrac{\lambda_i}{\mu_{i+1}}$ converges.

3.5

Exercise

Choose the correct answer


1. The Poisson process is
(a)Stationary (b) Markovian (c)Continuous random process (d)None
Ans:(b)
2. The correlation coefficient of the Poisson process is
(a) $\sqrt{t_1/t_2}$ (b) $\sqrt{t_2/t_1}$ (c) $\lambda t$ (d) None
Ans: (a)
3. Consider a computer system with a Poisson job-arrival stream at an average rate of 60 per hour. What is the probability that the time interval between successive job arrivals is longer than 4 minutes?
(a) 0.9997 (b) 0.1328 (c) 0.0183 (d) 0.1353
Ans: (c)
4. A birth and death process is a
(a) ergodic (b) finite Markovian (c) stationary (d) None
Ans: (b)
5. Which one of the following is Poisson process
(a)The arrival of a customer at a bank (b)Random walk with reflecting barriers (c)The time duration between consecutive cars passing a fixed location
on a road.
Ans:(a)

Two marks questions

1. Define the Poisson process and give an example.
2. What are the postulates of a Poisson process?
3. Prove that the Poisson process is a Markov process.
4. Prove that the difference of two independent Poisson processes is not a Poisson process.
5. Define the birth and death process.
6. Prove that the sum of independent Poisson processes is a Poisson process.
7. What is the superposition of n independent Poisson processes with respective average rates $\lambda_1, \lambda_2, \ldots, \lambda_n$?

