1. Random Process
2. Markov Chain
3. Poisson Process
Chapter 1
Random Process
1.1
Basic Definitions
Example 1.1.3 If $X_n$ represents the outcome of the $n$th toss of a fair die, then $\{X_n,\ n \ge 1\}$ is a discrete random sequence, since $T = \{1, 2, 3, \dots\}$ and $S = \{1, 2, 3, 4, 5, 6\}$. Similarly, if $X(t)$ = the number of defective items found at trial $t = 1, 2, 3, \dots$, then $\{X(t)\}$ is a discrete random sequence.
Example 1.1.5 If $X(t)$ represents the number of telephone calls received in the interval $(0, t)$, then $\{X(t)\}$ is a discrete random process, since $S = \{0, 1, 2, \dots\}$ is discrete while the parameter $t$ is continuous. Likewise, if $X(t)$ = the number of defective items found up to time $t \ge 0$, then $\{X(t)\}$ is a discrete random process.
Example 1.1.8 If $X_n$ represents the temperature at the end of the $n$th hour of a day, then $\{X_n,\ 1 \le n \le 24\}$ is a continuous random sequence.
Definition 1.1.9 (Continuous random process) If both T and S are continuous, the random process is called a continuous random process.
1.2
Types of Stationarity
Definition 1.2.5 (Jointly WSS process) Two processes $\{X(t)\}$ and $\{Y(t)\}$ are called jointly WSS if each is WSS and their cross-correlation depends only on the time difference $\tau$. That is, $R_{XY}(t, t+\tau) = E[X(t)Y(t+\tau)] = R_{XY}(\tau)$. It follows that the cross-covariance of jointly WSS processes $\{X(t)\}$ and $\{Y(t)\}$ also depends only on the time difference $\tau$. Thus $C_{XY}(t, t+\tau) = R_{XY}(\tau) - \mu_X \mu_Y$.
Definition 1.2.7 A random process that is not stationary in any sense is called
an evolutionary random process.
1.3
Time Averages
The time-averaged mean of a process $\{X(t)\}$ over $(-T, T)$ is $\bar{X}_T = \frac{1}{2T}\int_{-T}^{T} X(t)\,dt$, and the time-averaged autocorrelation function is $\frac{1}{2T}\int_{-T}^{T} X(t)X(t+\tau)\,dt$.
Definition 1.3.4 (Cross-Correlation Function) For a two-dimensional random process $\{X(t), Y(t) : t \ge 0\}$, the cross-correlation function $R_{XY}(\tau)$ is defined by $R_{XY}(t, t+\tau) = E[X(t)Y(t+\tau)]$.
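As a quick numerical sketch of these two time averages (assuming NumPy, with the deterministic sample path $x(t) = \cos 2\pi t$ chosen purely for illustration):

```python
import numpy as np

# Approximate the time-averaged mean (1/2T) * int_{-T}^{T} x(t) dt and the
# time-averaged autocorrelation (1/2T) * int_{-T}^{T} x(t) x(t+tau) dt
# by Riemann sums, for the sample path x(t) = cos(2*pi*t).
T, dt = 500.0, 0.01
t = np.arange(-T, T, dt)
x = np.cos(2 * np.pi * t)

time_mean = x.mean()                                   # approx 0 for this path
tau = 0.25
time_acf = np.mean(x * np.cos(2 * np.pi * (t + tau)))  # approx 0.5*cos(2*pi*tau) = 0
```

For a pure sinusoid both averages converge as $T$ grows; the same numerical template can be reused to check any of the ergodicity examples later in the chapter.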
1.4
Examples
Example 1.4.1 Examine whether the Poisson process $\{X(t)\}$ given by the probability law
$$P\{X(t) = r\} = \frac{e^{-\lambda t}(\lambda t)^r}{r!},\qquad r = 0, 1, 2, \dots$$
is stationary.
Example 1.4.2 Show that the random process $X(t) = A\sin(\omega t + \theta)$, where $A$ and $\omega$ are constants and $\theta$ is a random variable uniformly distributed in $(0, 2\pi)$, is first-order stationary. Also find the autocorrelation function of the process.
$$E[X(t)] = \int X(t)\, f(\theta)\, d\theta \qquad (1)$$
where
$$f(\theta) = \begin{cases} \dfrac{1}{2\pi}, & 0 < \theta < 2\pi \\ 0, & \text{otherwise.} \end{cases} \qquad (2)$$
Therefore
$$E[X(t)] = \int_0^{2\pi} A\sin(\omega t + \theta)\,\frac{1}{2\pi}\, d\theta = \frac{A}{2\pi}\big[-\cos(\omega t + \theta)\big]_0^{2\pi} = \frac{A}{2\pi}\big[-\cos(\omega t + 2\pi) + \cos\omega t\big] = 0, \qquad (3)$$
a constant, so $\{X(t)\}$ is first-order stationary.
$$R_{XX}(t, t+\tau) = E[X(t)X(t+\tau)] = E\big[A\sin(\omega t + \theta)\,A\sin(\omega(t+\tau) + \theta)\big] \qquad (4)$$
$$= \frac{A^2}{2}\,E\big[\cos\omega\tau - \cos(2\omega t + \omega\tau + 2\theta)\big] = \frac{A^2}{2}\cos\omega\tau - \frac{A^2}{2}\,E\big[\cos(2\omega t + \omega\tau + 2\theta)\big]$$
Now
$$E\big[\cos(2\omega t + \omega\tau + 2\theta)\big] = \int_0^{2\pi}\cos(2\omega t + \omega\tau + 2\theta)\,\frac{1}{2\pi}\, d\theta = \frac{1}{4\pi}\big[\sin(2\omega t + \omega\tau + 2\theta)\big]_0^{2\pi} = 0. \qquad (5)$$
Therefore $R_{XX}(\tau) = \dfrac{A^2}{2}\cos\omega\tau$.
$E\{\cos(\omega t + Y)\} = E\{\cos\omega t\cos Y - \sin\omega t\sin Y\} = \cos\omega t\,E[\cos Y] - \sin\omega t\,E[\sin Y].$
From (3) and (6), the mean is a constant and the ACF is a function of $\tau = t_1 - t_2$ only. Hence $\{X(t)\}$ is a WSS process.
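The two WSS facts above (constant mean, ACF depending only on $\tau$) can be spot-checked by Monte Carlo; a minimal sketch assuming NumPy, with illustrative values $A = 2$, $\omega = 1.5$:

```python
import numpy as np

# Monte Carlo check that X(t) = A*sin(w*t + theta), theta ~ U(0, 2*pi),
# has ensemble mean 0 and ACF (A**2 / 2) * cos(w * tau).
rng = np.random.default_rng(0)
A, w = 2.0, 1.5                       # assumed constants for this illustration
theta = rng.uniform(0.0, 2 * np.pi, size=200_000)

t, tau = 0.7, 0.4
x_t = A * np.sin(w * t + theta)
x_t_tau = A * np.sin(w * (t + tau) + theta)

mean_est = x_t.mean()                 # ensemble mean, near 0 for any t
acf_est = (x_t * x_t_tau).mean()      # near (A**2/2)*cos(w*tau) for any t
acf_theory = (A**2 / 2) * np.cos(w * tau)
```

Changing `t` while holding `tau` fixed leaves both estimates (up to sampling noise) unchanged, which is exactly the WSS property.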
Example 1.4.4 Two random processes $\{X(t)\}$ and $\{Y(t)\}$ are defined by $X(t) = A\cos\omega t + B\sin\omega t$ and $Y(t) = B\cos\omega t - A\sin\omega t$. Show that $\{X(t)\}$ and $\{Y(t)\}$ are jointly WSS if $A$ and $B$ are uncorrelated random variables with zero means and the same variances, and $\omega$ is a constant.
Example 1.4.6 Show that the random process $X(t) = A\cos(\omega t + \theta)$ is a WSS process if $A$ and $\omega$ are constants and $\theta$ is a uniformly distributed random variable in $(0, 2\pi)$.
Solution: The density of $\theta$ is $f(\theta) = \frac{1}{2\pi},\ 0 \le \theta \le 2\pi$.
$$E[X(t)] = \int_0^{2\pi} A\cos(\omega t + \theta)\,\frac{1}{2\pi}\, d\theta = \frac{A}{2\pi}\big[\sin(\omega t + \theta)\big]_0^{2\pi} = \frac{A}{2\pi}\big[\sin\omega t - \sin\omega t\big] = 0,$$
a constant. Therefore the mean is a constant.
Now
$$R_{XX}(t, t+\tau) = E[X(t)X(t+\tau)] = E\big[A\cos(\omega t + \theta)\,A\cos(\omega t + \omega\tau + \theta)\big] = \tfrac{1}{2}E\big[2A^2\cos(\omega t + \theta)\cos(\omega t + \omega\tau + \theta)\big]$$
$$= \frac{A^2}{2}\,E\big[\cos\omega\tau + \cos(2\omega t + \omega\tau + 2\theta)\big] = \frac{A^2}{2}\cos\omega\tau + \frac{A^2}{4\pi}\int_0^{2\pi}\cos(2\omega t + \omega\tau + 2\theta)\, d\theta$$
$$= \frac{A^2}{2}\cos\omega\tau + \frac{A^2}{8\pi}\big[\sin(2\omega t + \omega\tau + 2\theta)\big]_0^{2\pi} = \frac{A^2}{2}\cos\omega\tau,$$
a function of $\tau$ only.
Therefore the mean is a constant and the autocorrelation function depends only on $\tau$, so $\{X(t)\}$ is a WSS process.
Solution
(a) The p.d.f. of $Y$ is $f(y) = \frac{1}{2A},\ -A < y < A$, and the p.d.f. of $\theta$ is $f(\theta) = \frac{1}{2\pi}$.
(b) $E[Y] = \int_{-A}^{A}\frac{y}{2A}\, dy = \frac{1}{2A}\Big[\frac{y^2}{2}\Big]_{-A}^{A} = \frac{1}{4A}\big(A^2 - A^2\big) = 0.$
$$R_{XX}(t, t+\tau) = \frac{E[Y^2]}{2}\,E\big[2\cos(\omega t + \theta)\cos(\omega t + \omega\tau + \theta)\big] = \frac{E[Y^2]}{2}\Big[\cos\omega\tau + E\cos(2\omega t + \omega\tau + 2\theta)\Big]$$
$$= \frac{E[Y^2]}{2}\cos\omega\tau + \frac{E[Y^2]}{8\pi}\big[\sin(2\omega t + \omega\tau + 2\theta)\big]_0^{2\pi} = \frac{E[Y^2]}{2}\cos\omega\tau.$$
Hence $R_{XX}(\tau) = \dfrac{E[Y^2]}{2}\cos\omega\tau$.
(c) Yes, $X(t)$ is a WSS process, because the mean is constant and $R_{XX}(\tau)$ is a function of $\tau$ only.
A random process $\{X(t)\}$ has the probability distribution
$$P(X(t) = n) = \begin{cases} \dfrac{(at)^{n-1}}{(1+at)^{n+1}}, & n = 1, 2, 3, \dots \\[4pt] \dfrac{at}{1+at}, & n = 0. \end{cases}$$
Show that it is not stationary.
Solution
$$E[X(t)] = \sum_{n=0}^{\infty} n\,P(n) = 0\cdot\frac{at}{1+at} + \sum_{n=1}^{\infty} n\,\frac{(at)^{n-1}}{(1+at)^{n+1}} = \frac{1}{(1+at)^2}\sum_{n=1}^{\infty} n\,u^{n-1},\quad\text{where } u = \frac{at}{1+at},$$
$$= \frac{1}{(1+at)^2}\big(1 + 2u + 3u^2 + \dots\big) = \frac{1}{(1+at)^2}\cdot\frac{1}{(1-u)^2} = \frac{1}{(1+at)^2}\cdot(1+at)^2 = 1,$$
a constant, since $1 - u = \frac{1}{1+at}$.
$$E[X^2(t)] = \sum_{n=1}^{\infty} n^2 P(n) = \sum_{n=1}^{\infty}\{n(n+1) - n\}\,\frac{(at)^{n-1}}{(1+at)^{n+1}} = \frac{1}{(1+at)^2}\Big[\sum_{n=1}^{\infty} n(n+1)u^{n-1} - \sum_{n=1}^{\infty} n u^{n-1}\Big]$$
$$= \frac{1}{(1+at)^2}\Big[\frac{2}{(1-u)^3} - \frac{1}{(1-u)^2}\Big],\quad\text{since } (1-x)^{-3} = \sum_{n=1}^{\infty}\frac{n(n+1)}{2}x^{n-1},$$
$$= \frac{2(1+at)^3}{(1+at)^2} - 1 = 2(1+at) - 1 = 1 + 2at.$$
Therefore $\operatorname{Var}[X(t)] = E[X^2(t)] - \{E[X(t)]\}^2 = 2at$, which depends on $t$. Hence the process is not stationary.
Example 1.4.9 Assume that a random process $\{X(t)\}$ has four sample functions $X(t, s_1) = \cos t$, $X(t, s_2) = -\cos t$, $X(t, s_3) = \sin t$, $X(t, s_4) = -\sin t$, all of which are equally likely. Show that it is a WSS process.
Solution
$$E[X(t)] = \frac{1}{4}\sum_{i=1}^{4} X(t, s_i) = \frac{1}{4}\big(\cos t - \cos t + \sin t - \sin t\big) = 0,$$
a constant.
$$R_{XX}(t_1, t_2) = \frac{1}{4}\sum_{i=1}^{4} X(t_1, s_i)\,X(t_2, s_i) = \frac{1}{4}\big(2\cos t_1\cos t_2 + 2\sin t_1\sin t_2\big) = \frac{1}{2}\cos(t_2 - t_1) = \frac{\cos\tau}{2},$$
where $\tau = t_2 - t_1$. Hence $\{X(t)\}$ is a WSS process.
Since $E[X(t)]$ and $\operatorname{Var}[X(t)]$ are functions of time $t$, the random process $\{X(t)\}$ is not stationary in any sense. It is evolutionary.
Auto-covariance:
$$C_{XX}(t_1, t_2) = R_{XX}(t_1, t_2) - E[X(t_1)]\,E[X(t_2)] = \sigma_1^2 + t_1 t_2\sigma_2^2 + p^2 + t_1 t_2 q^2 + pq(t_1 + t_2) - (p + qt_1)(p + qt_2).$$
Therefore $C_{XX}(t_1, t_2) = \sigma_1^2 + t_1 t_2\sigma_2^2$.
Example 1.4.11 If $X(t) = Y\cos t + Z\sin t$ for all $t$, where $Y$ and $Z$ are independent binary random variables, each of which assumes the values $-1$ and $2$ with probabilities $\frac{2}{3}$ and $\frac{1}{3}$ respectively, show that $\{X(t)\}$ is a WSS process.
Here
$$f(\theta) = \begin{cases} \dfrac{1}{2\pi}, & -\pi < \theta < \pi \\ 0, & \text{otherwise.} \end{cases}$$
Now
$$E[X(t)] = E(R)\,E[\cos(\omega t + \theta)] = 2\cdot\frac{1}{2\pi}\int_{-\pi}^{\pi}\cos(\omega t + \theta)\, d\theta = \frac{1}{\pi}\big[\sin(\omega t + \theta)\big]_{-\pi}^{\pi} = \frac{1}{\pi}\big[\sin(\omega t + \pi) - \sin(\omega t - \pi)\big] = \frac{1}{\pi}\big[-\sin\omega t + \sin\omega t\big] = 0.$$
$$R_{XX}(t, t+\tau) = E[X(t)X(t+\tau)] = E\big[R^2\cos(\omega t + \theta)\cos(\omega t + \omega\tau + \theta)\big] = E[R^2]\,E\big[\cos(\omega t + \theta)\cos(\omega t + \omega\tau + \theta)\big].$$
Since $\operatorname{Var}(R) = 6$, we get $E(R^2) = V(R) + [E(R)]^2 = 6 + 4 = 10$. Therefore
$$R_{XX}(t, t+\tau) = \frac{10}{2}\,E\big[\cos\omega\tau + \cos(2\omega t + \omega\tau + 2\theta)\big] = 5\cos\omega\tau + \frac{5}{2\pi}\int_{-\pi}^{\pi}\cos(2\omega t + \omega\tau + 2\theta)\, d\theta$$
$$= 5\cos\omega\tau + \frac{5}{4\pi}\big[\sin(2\omega t + \omega\tau + 2\theta)\big]_{-\pi}^{\pi} = 5\cos\omega\tau + 0 = 5\cos\omega\tau.$$
Solution:
$$E[X(t)] = aE[\cos(\omega t + \theta)] = aE\big\{E[\cos(\omega t + \theta)\,/\,\omega]\big\} = aE\big[\cos\omega t\,E(\cos\theta) - \sin\omega t\,E(\sin\theta)\big].$$
Therefore
$$E[X(t)] = aE\Big[\cos\omega t\,\frac{1}{2\pi}\int_0^{2\pi}\cos\theta\, d\theta - \sin\omega t\,\frac{1}{2\pi}\int_0^{2\pi}\sin\theta\, d\theta\Big] = 0.$$
Similarly,
$$R_{XX}(t_1, t_2) = \frac{a^2}{2}\,E\big[\cos\omega(t_1 - t_2)\big],$$
which is a function of $t_1 - t_2$ only.
Example 1.4.14 Show that the random process $X(t) = A\sin t + B\cos t$, where $A$ and $B$ are independent random variables with zero means and equal standard deviations, is stationary of the second order.
Solution:
The ACF of a second-order stationary process is a function of the time difference only, not of absolute time. Consider $R_{XX}(t, t+\tau) = E[X(t)X(t+\tau)]$.
$$R_{XX}(t, t+\tau) = E\big[(A\sin t + B\cos t)\big(A\sin(t+\tau) + B\cos(t+\tau)\big)\big]$$
$$= E[A^2]\sin(t+\tau)\sin t + E[B^2]\cos(t+\tau)\cos t + E[AB]\big\{\sin t\cos(t+\tau) + \sin(t+\tau)\cos t\big\}.$$
Given $E(A) = E(B) = 0$, independence gives $E(AB) = E(A)E(B) = 0$, and since $V(A) = V(B) = \sigma^2$, $E(A^2) = E(B^2) = \sigma^2$.
Therefore
$$R_{XX}(t, t+\tau) = \sigma^2\big[\sin(t+\tau)\sin t + \cos(t+\tau)\cos t\big] = \sigma^2\cos(t + \tau - t) = \sigma^2\cos\tau.$$
Therefore the ACF is a function of the time difference only. Hence $\{X(t)\}$ is stationary of order two.
Solution:
$$E[Y(t)] = E[X(t)]\,E[\cos(\omega t + \theta)] = E[X(t)]\Big\{\frac{1}{2\pi}\int_0^{2\pi}\cos(\omega t + \theta)\, d\theta\Big\} = E[X(t)]\,(0) = 0.$$
Therefore
$$R_{YY}(t, t+\tau) = R_{XX}(\tau)\,E\big[\cos(\omega t + \theta)\cos(\omega t + \omega\tau + \theta)\big] = \frac{R_{XX}(\tau)}{2}\Big[\cos\omega\tau + \frac{1}{2\pi}\int_0^{2\pi}\cos(2\omega t + \omega\tau + 2\theta)\, d\theta\Big]$$
$$= \frac{R_{XX}(\tau)}{2}\cos\omega\tau + \frac{R_{XX}(\tau)}{2}(0) = \frac{R_{XX}(\tau)}{2}\cos\omega\tau.$$
Example 1.4.16 Let $X(t) = A\cos\lambda t + B\sin\lambda t$, where $A$ and $B$ are independent, normally distributed random variables $N(0, \sigma^2)$. Obtain the covariance function of the process $\{X(t) : -\infty < t < \infty\}$. Is $\{X(t)\}$ covariance stationary?
Solution:
$$E[X(t)] = E[A]\cos\lambda t + E[B]\sin\lambda t.$$
Since $A$ and $B$ are independent normal random variables $N(0, \sigma^2)$, $E(A) = E(B) = 0$, $V(A) = V(B) = \sigma^2$, and $E(A^2) = E(B^2) = \sigma^2$. Thus $E[X(t)] = 0$.
$$R_{XX}(t_1, t_2) = \frac{E[B^2]}{2}\,E\big[2\cos(50t_1 + \theta)\cos(50t_2 + \theta)\big] = \frac{1}{2}\cos 50(t_1 - t_2) + \frac{1}{2}\,E\big[\cos(50t_1 + 50t_2 + 2\theta)\big]$$
$$= \frac{1}{2}\cos 50(t_1 - t_2) + \frac{1}{8\pi}\big[\sin(50t_1 + 50t_2 + 2\theta)\big]_0^{2\pi} = \frac{1}{2}\cos 50(t_1 - t_2).$$
...with probability $\frac{1}{3}$ for all $i$, compute the mean and ACF. Is the process strict-sense stationary?
Solution:
$$\text{Mean} = \sum_{i=1}^{3} X(t, s_i)\,P(s_i) = \frac{4\sin t}{3},$$
which depends on $t$, so the process is not strict-sense stationary.
1.5
Ergodicity
Solution:
$$R_{XX}(t, t+\tau) = E[X(t)X(t+\tau)] = E\big[10\cos(100t + \theta)\cdot 10\cos(100t + 100\tau + \theta)\big] = 50\,E\big[2\cos(100t + 100\tau + \theta)\cos(100t + \theta)\big]\cdot\tfrac{1}{2}\cdot 2$$
$$= 50\,E\big[\cos 100\tau + \cos(200t + 100\tau + 2\theta)\big] = 50\cos 100\tau + \frac{50}{2\pi}\int_0^{2\pi}\cos(200t + 100\tau + 2\theta)\, d\theta$$
$$= 50\cos 100\tau + \frac{50}{4\pi}\big[\sin(200t + 100\tau + 2\theta)\big]_0^{2\pi} = 50\cos 100\tau.$$
Time-averaged ACF:
$$\overline{X(t)X(t+\tau)} = \lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T} X(t)\,X(t+\tau)\, dt = \lim_{T\to\infty}\frac{25}{T}\int_{-T}^{T}\big\{\cos 100\tau + \cos(200t + 100\tau + 2\theta)\big\}\, dt$$
$$= 50\cos 100\tau + \lim_{T\to\infty}\frac{1}{8T}\big[\sin(200t + 100\tau + 2\theta)\big]_{-T}^{T} = 50\cos 100\tau.$$
Since the ensemble ACF equals the time-averaged ACF, the process is correlation ergodic.
Example 1.5.2 Prove that the random process $\{X(t)\}$ defined by $X(t) = A\cos(\omega t + \theta)$, where $A$ and $\omega$ are constants and $\theta$ is uniformly distributed over $(0, 2\pi)$, is ergodic in both the mean and the autocorrelation function.
Solution:
Ensemble mean and ACF:
$$E[X(t)] = \frac{A}{2\pi}\int_0^{2\pi}\cos(\omega t + \theta)\, d\theta = \frac{A}{2\pi}\big[\sin(\omega t + \theta)\big]_0^{2\pi} = \frac{A}{2\pi}\big[\sin\omega t - \sin\omega t\big] = 0.$$
$$R_{XX}(t, t+\tau) = E\big[A^2\cos(\omega t + \omega\tau + \theta)\cos(\omega t + \theta)\big] = \frac{A^2}{2}\,E\big[\cos\omega\tau + \cos(2\omega t + \omega\tau + 2\theta)\big]$$
$$= \frac{A^2}{2}\cos\omega\tau + \frac{A^2}{4\pi}\int_0^{2\pi}\cos(2\omega t + \omega\tau + 2\theta)\, d\theta = \frac{A^2}{2}\cos\omega\tau + \frac{A^2}{8\pi}\big[\sin(2\omega t + \omega\tau + 2\theta)\big]_0^{2\pi}.$$
Therefore $R_{XX}(\tau) = \dfrac{A^2}{2}\cos\omega\tau$.
Time-averaged mean:
$$\bar{X}_T = \lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T} X(t)\, dt = \lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T} A\cos(\omega t + \theta)\, dt = \lim_{T\to\infty}\frac{A}{2T}\Big[\frac{\sin(\omega t + \theta)}{\omega}\Big]_{-T}^{T} = 0.$$
Therefore $\bar{X}_T = 0$.
Time-averaged ACF:
$$\overline{X(t+\tau)X(t)} = \lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T} A^2\cos(\omega t + \omega\tau + \theta)\cos(\omega t + \theta)\, dt = \lim_{T\to\infty}\frac{A^2}{4T}\int_{-T}^{T}\big\{\cos\omega\tau + \cos(2\omega t + \omega\tau + 2\theta)\big\}\, dt$$
$$= \frac{A^2}{2}\cos\omega\tau + \lim_{T\to\infty}\frac{A^2}{4T}\Big[\frac{\sin(2\omega t + \omega\tau + 2\theta)}{2\omega}\Big]_{-T}^{T} = \frac{A^2}{2}\cos\omega\tau.$$
Therefore $\overline{X(t+\tau)X(t)} = \dfrac{A^2}{2}\cos\omega\tau$.
Since the ensemble mean equals the time-averaged mean, $\{X(t)\}$ is mean ergodic. Also, since the ensemble ACF equals the time-averaged ACF, $\{X(t)\}$ is correlation ergodic.
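Ergodicity can also be seen numerically: time averages taken along a single realization (one fixed draw of $\theta$) approach the ensemble values. A sketch assuming NumPy, with illustrative constants $A = 1$, $\omega = 2$:

```python
import numpy as np

# One realization of X(t) = A*cos(w*t + theta): a single random theta is
# drawn once, then held fixed; time averages are taken over a long window.
rng = np.random.default_rng(1)
A, w = 1.0, 2.0
theta = rng.uniform(0.0, 2 * np.pi)       # one fixed draw -> one sample path

T, dt = 1000.0, 0.01
t = np.arange(-T, T, dt)
x = A * np.cos(w * t + theta)

time_mean = x.mean()                      # -> 0, the ensemble mean
tau = 0.3
time_acf = np.mean(A * np.cos(w * (t + tau) + theta) * x)
acf_theory = (A**2 / 2) * np.cos(w * tau) # ensemble ACF at lag tau
```

Because the time averages computed from this one path agree with the ensemble mean and ACF, the path behaves ergodically, matching the analytical conclusion.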
Example 1.5.3 $\{X(t)\}$ is the random telegraph signal process with $E[X(t)] = 0$ and $R(\tau) = e^{-2\lambda|\tau|}$. Find the mean and variance of the time average of $\{X(t)\}$ over $(-T, T)$. Is it mean ergodic?
Solution:
Mean of the time average of $\{X(t)\}$:
$$\bar{X}_T = \frac{1}{2T}\int_{-T}^{T} X(t)\, dt,\qquad\text{so}\qquad E[\bar{X}_T] = \frac{1}{2T}\int_{-T}^{T} E[X(t)]\, dt = 0.$$
To find $\operatorname{Var}(\bar{X}_T)$:
$$\operatorname{Var}(\bar{X}_T) = \frac{1}{T}\int_0^{2T}\Big(1 - \frac{\tau}{2T}\Big)\,C(\tau)\, d\tau = \frac{1}{T}\int_0^{2T}\Big(1 - \frac{\tau}{2T}\Big)\,e^{-2\lambda\tau}\, d\tau$$
$$= \frac{1}{T}\int_0^{2T} e^{-2\lambda\tau}\, d\tau - \frac{1}{2T^2}\int_0^{2T}\tau\,e^{-2\lambda\tau}\, d\tau = \frac{1}{T}\Big[\frac{e^{-2\lambda\tau}}{-2\lambda}\Big]_0^{2T} - \frac{1}{2T^2}\Big\{\Big[\tau\,\frac{e^{-2\lambda\tau}}{-2\lambda}\Big]_0^{2T} + \frac{1}{2\lambda}\int_0^{2T} e^{-2\lambda\tau}\, d\tau\Big\}$$
Therefore
$$\operatorname{Var}(\bar{X}_T) = \frac{1}{2\lambda T} + \frac{1}{8\lambda^2 T^2}\big(e^{-4\lambda T} - 1\big).$$
Since $\operatorname{Var}(\bar{X}_T)\to 0$ as $T\to\infty$, $\{X(t)\}$ is mean ergodic.
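The variance expression derived above can be evaluated numerically to watch the ergodic limit; a small sketch (the rate $\lambda = 1$ is an arbitrary illustrative choice):

```python
import math

# Var(X_bar_T) = 1/(2*lam*T) + (exp(-4*lam*T) - 1) / (8*lam**2 * T**2):
# the variance of the time average of the random telegraph signal over (-T, T).
def var_time_mean(lam: float, T: float) -> float:
    return 1.0 / (2 * lam * T) + (math.exp(-4 * lam * T) - 1) / (8 * lam**2 * T**2)

lam = 1.0
v10 = var_time_mean(lam, 10.0)
v1000 = var_time_mean(lam, 1000.0)
# v1000 << v10: the variance decays roughly like 1/(2*lam*T), so X_bar_T -> 0
```

The dominant term for large $T$ is $\frac{1}{2\lambda T}$, which is why the decay to zero (and hence mean ergodicity) is slow but certain.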
If the number of points of a Poisson process in the interval $(0, t)$ is $k$, let
$$X(t) = \begin{cases} 1, & \text{if } k \text{ is even} \\ -1, & \text{if } k \text{ is odd.} \end{cases}$$
Find the ACF of $X(t)$. Also, if $P(A = 1) = P(A = -1) = \frac{1}{2}$ and $A$ is independent of $X(t)$, consider the process $Y(t) = A\,X(t)$.
Solution:
The probability law of the Poisson count is $P[N(t) = k] = \dfrac{e^{-\lambda t}(\lambda t)^k}{k!},\ k = 0, 1, 2, \dots$
$$P[X(t) = 1] = e^{-\lambda t}\Big[1 + \frac{(\lambda t)^2}{2!} + \frac{(\lambda t)^4}{4!} + \dots\Big] = e^{-\lambda t}\cosh\lambda t.$$
$$P[X(t) = -1] = e^{-\lambda t}\Big[\frac{\lambda t}{1!} + \frac{(\lambda t)^3}{3!} + \dots\Big] = e^{-\lambda t}\sinh\lambda t.$$
For the factor $A$:
$$E[A] = 1\cdot\frac{1}{2} + (-1)\cdot\frac{1}{2} = 0\qquad\text{and}\qquad E[A^2] = \frac{1}{2} + \frac{1}{2} = 1.$$
Chapter 2
Markov Chain
2.1
Basic Definitions
Example 2.1.2 The probability of rain today depends only on the weather conditions that existed during the last two days, and not on earlier weather conditions.
Definition 2.1.3 (Markov Chain) If the above condition is satisfied for all $n$, then the process $\{X_n\},\ n = 0, 1, 2, \dots$ is called a Markov chain, and the constants $(a_1, a_2, \dots, a_n)$ are called the states of the Markov chain. In other words, a discrete-parameter Markov process is called a Markov chain.
Definition 2.1.4 (One-Step Transition Probability) The conditional probability $P[X_n = a_j / X_{n-1} = a_i]$ is called the one-step transition probability from state $a_i$ to state $a_j$ at the $n$th step and is denoted by $P_{ij}(n-1, n)$. Each row sum of the t.p.m. is 1.
Definition 2.1.7 (n-Step Transition Probability) The conditional probability that the process is in state $a_j$ at step $n$, given that it was in state $a_i$ at step 0, is called the $n$-step transition probability and is denoted by $p_{ij}^{(n)}$.
Chapman-Kolmogorov theorem: If $P$ is the t.p.m. of a homogeneous Markov chain, then the $n$-step t.p.m. $P^{(n)}$ is equal to $P^n$, i.e., $p_{ij}^{(n)} = (P^n)_{ij}$.
The state probability distribution at step $n$ is written $P^{(n)} = \big(p_1^{(n)}, p_2^{(n)}, \dots, p_k^{(n)}\big)$, and the stationary distribution is $\pi = (\pi_1, \pi_2, \pi_3, \dots, \pi_k)$.
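The Chapman-Kolmogorov relation $P^{(n)} = P^n$ is easy to check numerically. A sketch assuming NumPy, using the two-state traffic-light t.p.m. of Example 2.3.2 as the matrix:

```python
import numpy as np

# Chapman-Kolmogorov in matrix form: the n-step t.p.m. is the matrix power P^n.
P = np.array([[0.12, 0.88],
              [0.07, 0.93]])

P2 = P @ P                          # two-step transition probabilities
P4 = np.linalg.matrix_power(P, 4)   # four-step t.p.m., equals (P^2)^2

row_sums = P4.sum(axis=1)           # every row of P^n still sums to 1
```

Since each power of a stochastic matrix is again stochastic, the row sums of $P^n$ stay equal to 1 for every $n$.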
2.2
Definition 2.2.1 If $p_{ij}^{(n)} > 0$ for some $n$ and for all $i$ and $j$, then every state can be reached from every other state. When this condition is satisfied, the Markov chain is said to be irreducible.
Definition 2.2.2 State $i$ of a Markov chain is called a return state if $p_{ii}^{(n)} > 0$ for some $n > 1$.
Definition 2.2.3 The period $d_i$ of a return state $i$ is defined by $d_i = \text{GCD}\{m : p_{ii}^{(m)} > 0\}$.
Definition 2.2.4 (Recurrence time probability) The probability that the chain returns to state $i$, starting from state $i$, for the first time at the $n$th step is called the recurrence time probability or the first return time probability. It is denoted by $f_{ii}(n)$. Then $F_{ii} = \sum_{n=1}^{\infty} f_{ii}^{(n)}$ is the probability of an eventual return to state $i$, and $\mu_{ii} = \sum_{n=1}^{\infty} n f_{ii}^{(n)}$ is the mean recurrence time of state $i$.
Remark 2.2.6
2.3
Examples
Example 2.3.1 Consider a Markov chain whose transition probability matrix $P$ has third row $(0.3\ \ 0.4\ \ 0.3)$. Find the steady-state probabilities of the system.
Solution: The steady-state vector $\pi = (\pi_1\ \pi_2\ \pi_3)$ satisfies
$$(\pi_1\ \pi_2\ \pi_3)\,P = (\pi_1\ \pi_2\ \pi_3)$$
together with $\pi_1 + \pi_2 + \pi_3 = 1$. Eliminating between the resulting linear equations gives, for instance, $6.2\pi_1 = 2.1$, so that $\pi_1 = \frac{2.1}{6.2}$, and the remaining steady-state probabilities follow similarly.
Example 2.3.2 At an intersection, a working traffic light will be out of order the next day with probability 0.07, and an out-of-order traffic light will be working the next day with probability 0.88. Let $X_n = 1$ if on day $n$ the traffic light will work, and $X_n = 0$ if on day $n$ it will not work. Is $\{X_n;\ n = 0, 1, 2, \dots\}$ a Markov chain? If so, write the transition probability matrix.
Solution: Yes, $\{X_n;\ n = 0, 1, 2, \dots\}$ is a Markov chain with state space $\{0, 1\}$, since the state on day $n+1$ depends only on the state on day $n$. The transition probability matrix is
$$P = \begin{pmatrix} 0.12 & 0.88 \\ 0.07 & 0.93 \end{pmatrix}.$$
Example 2.3.3 Let $\{X_n\}$ be a Markov chain with state space $\{0, 1, 2\}$, with initial probability vector $P^{(0)} = (0.7, 0.2, 0.1)$ and a one-step transition probability matrix $P$ whose second row is $(0.6\ \ 0.2\ \ 0.2)$.
Solution: The two-step t.p.m. is $P^{(2)} = P^2$, obtained by multiplying $P$ by itself; its second row works out to $(0.24\ \ 0.42\ \ 0.34)$, and the required step-2 probabilities follow from $\sum_{i=1}^{3} P^{(0)}_i\, p_{ij}^{(2)}$.
The initial distribution is $P(X_0 = i) = \frac{1}{6}$ for each face $i$, in particular for $i = 4, 5, 6$.
The transition probability matrix of the chain $X_n$ (the maximum of the first $n$ throws of the die) is
$$P = \frac{1}{6}\begin{pmatrix} 1 & 1 & 1 & 1 & 1 & 1 \\ 0 & 2 & 1 & 1 & 1 & 1 \\ 0 & 0 & 3 & 1 & 1 & 1 \\ 0 & 0 & 0 & 4 & 1 & 1 \\ 0 & 0 & 0 & 0 & 5 & 1 \\ 0 & 0 & 0 & 0 & 0 & 6 \end{pmatrix}$$
$$P^2 = \frac{1}{36}\begin{pmatrix} 1 & 3 & 5 & 7 & 9 & 11 \\ 0 & 4 & 5 & 7 & 9 & 11 \\ 0 & 0 & 9 & 7 & 9 & 11 \\ 0 & 0 & 0 & 16 & 9 & 11 \\ 0 & 0 & 0 & 0 & 25 & 11 \\ 0 & 0 & 0 & 0 & 0 & 36 \end{pmatrix}$$
$$P(X_2 = 6) = \sum_{i=1}^{6} p_{i6}^{(2)}\,P(X_0 = i) = \frac{1}{6}\cdot\frac{1}{36}\big(11 + 11 + 11 + 11 + 11 + 36\big) = \frac{91}{216}.$$
Example 2.3.5 Three girls $G_1, G_2, G_3$ are throwing a ball to each other. $G_1$ always throws the ball to $G_2$ and $G_2$ always throws the ball to $G_3$, but $G_3$ is just as likely to throw the ball to $G_2$ as to $G_1$. Prove that the process is Markovian. Find the transition matrix and classify the states.
Solution:
The state $X_n$ (the girl holding the ball after the $n$th throw) depends only on $X_{n-1}$ and not on $X_{n-2}, X_{n-3}, \dots$ or earlier states. Therefore $\{X_n\}$ is a Markov chain with
$$P = \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ \frac{1}{2} & \frac{1}{2} & 0 \end{pmatrix}.$$
Now
$$P^2 = \begin{pmatrix} 0 & 0 & 1 \\ \frac{1}{2} & \frac{1}{2} & 0 \\ 0 & \frac{1}{2} & \frac{1}{2} \end{pmatrix},\qquad P^3 = \begin{pmatrix} \frac{1}{2} & \frac{1}{2} & 0 \\ 0 & \frac{1}{2} & \frac{1}{2} \\ \frac{1}{4} & \frac{1}{4} & \frac{1}{2} \end{pmatrix}.$$
$P_{11}^{(3)} > 0$, $P_{13}^{(2)} > 0$, $P_{21}^{(2)} > 0$, $P_{22}^{(2)} > 0$, $P_{33}^{(2)} > 0$, and all other $P_{ij}^{(n)} > 0$ for suitable $n$. Therefore the chain is irreducible.
$$P^4 = \begin{pmatrix} 0 & \frac{1}{2} & \frac{1}{2} \\ \frac{1}{4} & \frac{1}{4} & \frac{1}{2} \\ \frac{1}{4} & \frac{1}{2} & \frac{1}{4} \end{pmatrix},\qquad P^5 = \begin{pmatrix} \frac{1}{4} & \frac{1}{4} & \frac{1}{2} \\ \frac{1}{4} & \frac{1}{2} & \frac{1}{4} \\ \frac{1}{8} & \frac{3}{8} & \frac{1}{2} \end{pmatrix},\qquad P^6 = \begin{pmatrix} \frac{1}{4} & \frac{1}{2} & \frac{1}{4} \\ \frac{1}{8} & \frac{3}{8} & \frac{1}{2} \\ \frac{1}{4} & \frac{3}{8} & \frac{3}{8} \end{pmatrix},$$
and so on. Since $p_{ii}^{(n)} > 0$ for values of $n$ whose greatest common divisor is 1, every state is aperiodic; the chain being finite and irreducible, all states are non-null persistent and hence ergodic.
Example 2.3.6 Find the nature of the states of the Markov chain with the t.p.m.
$$P = \begin{pmatrix} 0 & 1 & 0 \\ \frac{1}{2} & 0 & \frac{1}{2} \\ 0 & 1 & 0 \end{pmatrix}.$$
Solution:
$$P^2 = P\cdot P = \begin{pmatrix} \frac{1}{2} & 0 & \frac{1}{2} \\ 0 & 1 & 0 \\ \frac{1}{2} & 0 & \frac{1}{2} \end{pmatrix},\qquad P^3 = P^2\cdot P = \begin{pmatrix} 0 & 1 & 0 \\ \frac{1}{2} & 0 & \frac{1}{2} \\ 0 & 1 & 0 \end{pmatrix} = P,\qquad P^4 = P^3\cdot P = P\cdot P = P^2,$$
and so on. In general, $P^{2n} = P^2$ and $P^{2n+1} = P$.
Since $P_{ii}^{(2)} = P_{ii}^{(4)} = P_{ii}^{(6)} = \dots > 0$ for all $i$, while the odd-step return probabilities vanish, all the states of the chain are periodic with period 2. Since the chain is finite and irreducible, all its states are non-null persistent. Because the states are periodic, they are not ergodic.
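The alternating pattern of powers claimed above can be verified directly; a short numerical check assuming NumPy:

```python
import numpy as np

# Numerical check for the t.p.m. of Example 2.3.6: successive powers
# alternate, P^3 = P and P^4 = P^2, so P^(2n) = P^2 and P^(2n+1) = P,
# confirming that every state has period 2.
P = np.array([[0.0, 1.0, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 1.0, 0.0]])

P2 = P @ P
P3 = P2 @ P
P4 = P3 @ P

alternates = bool(np.allclose(P3, P) and np.allclose(P4, P2))
```

Once `P3 == P` holds, every further power repeats one of the two matrices, so the check on two powers settles the whole pattern.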
Example 2.3.7 A gambler has Rs.2. He bets Rs.1 at a time and wins Rs.1 with probability $\frac{1}{2}$. He stops playing if he loses Rs.2 or wins Rs.4.
Solution: Let $X_n$ represent the amount with the player at the end of the $n$th round of the play. The state space of $\{X_n\}$ is $(0, 1, 2, 3, 4, 5, 6)$, as the game ends if the player loses all the money $(X_n = 0)$ or wins Rs.4, that is, has Rs.6 $(X_n = 6)$. States 0 and 6 are absorbing, and from every other state the chain moves one step up or down with probability $\frac{1}{2}$:
$$P = \begin{pmatrix} 1 & 0 & 0 & 0 & 0 & 0 & 0 \\ \frac{1}{2} & 0 & \frac{1}{2} & 0 & 0 & 0 & 0 \\ 0 & \frac{1}{2} & 0 & \frac{1}{2} & 0 & 0 & 0 \\ 0 & 0 & \frac{1}{2} & 0 & \frac{1}{2} & 0 & 0 \\ 0 & 0 & 0 & \frac{1}{2} & 0 & \frac{1}{2} & 0 \\ 0 & 0 & 0 & 0 & \frac{1}{2} & 0 & \frac{1}{2} \\ 0 & 0 & 0 & 0 & 0 & 0 & 1 \end{pmatrix}$$
The initial distribution is $P^{(0)} = (0\ \ 0\ \ 1\ \ 0\ \ 0\ \ 0\ \ 0)$. Successive distributions $P^{(n)} = P^{(n-1)}P$ are
$$P^{(1)} = \big(0,\ \tfrac{1}{2},\ 0,\ \tfrac{1}{2},\ 0,\ 0,\ 0\big)$$
$$P^{(2)} = \big(\tfrac{1}{4},\ 0,\ \tfrac{1}{2},\ 0,\ \tfrac{1}{4},\ 0,\ 0\big)$$
$$P^{(3)} = \big(\tfrac{1}{4},\ \tfrac{1}{4},\ 0,\ \tfrac{3}{8},\ 0,\ \tfrac{1}{8},\ 0\big)$$
$$P^{(4)} = \big(\tfrac{3}{8},\ 0,\ \tfrac{5}{16},\ 0,\ \tfrac{1}{4},\ 0,\ \tfrac{1}{16}\big)$$
$$P^{(5)} = \big(\tfrac{3}{8},\ \tfrac{5}{32},\ 0,\ \tfrac{9}{32},\ 0,\ \tfrac{1}{8},\ \tfrac{1}{16}\big)$$
$$P^{(6)} = \big(\tfrac{29}{64},\ 0,\ \tfrac{7}{32},\ 0,\ \tfrac{13}{64},\ 0,\ \tfrac{1}{8}\big)$$
$$P^{(7)} = \big(\tfrac{29}{64},\ \tfrac{7}{64},\ 0,\ \tfrac{27}{128},\ 0,\ \tfrac{13}{128},\ \tfrac{1}{8}\big)$$
$P(\text{the gambler has lost his money at the end of 5 plays}) = P(X_5 = 0) = \frac{3}{8}.$
$P(\text{the game lasts more than 7 plays}) = P(0 < X_7 < 6) = \frac{7}{64} + \frac{27}{128} + \frac{13}{128} = \frac{14 + 27 + 13}{128} = \frac{54}{128} = \frac{27}{64}.$
...activities on the next day is given by a t.p.m. $P$ whose third row is $(0.40\ \ 0.10\ \ 0.50)$. The steady-state distribution $\pi = (\pi_1\ \pi_2\ \pi_3)$ satisfies
$$(\pi_1\ \pi_2\ \pi_3)\,P = (\pi_1\ \pi_2\ \pi_3)$$
together with the normalization $\pi_1 + \pi_2 + \pi_3 = 1$, and the steady-state probabilities follow by solving the resulting linear equations.
Example 2.3.9 The t.p.m. of a Markov chain with three states 0, 1, 2 is
$$P = \begin{pmatrix} \frac{3}{4} & \frac{1}{4} & 0 \\ \frac{1}{4} & \frac{1}{2} & \frac{1}{4} \\ 0 & \frac{3}{4} & \frac{1}{4} \end{pmatrix}$$
and the initial state distribution is $P(X_0 = i) = \frac{1}{3},\ i = 0, 1, 2$.
Solution: With $P$ as given,
$$P^{(2)} = P^2 = \frac{1}{16}\begin{pmatrix} 10 & 5 & 1 \\ 5 & 8 & 3 \\ 3 & 9 & 4 \end{pmatrix}.$$
(i) $P[X_2 = 2] = P_{02}^{(2)}\,P[X_0 = 0] + P_{12}^{(2)}\,P[X_0 = 1] + P_{22}^{(2)}\,P[X_0 = 2] = \frac{1}{3}\Big[\frac{1}{16} + \frac{3}{16} + \frac{4}{16}\Big] = \frac{1}{6}.$
(ii) $P[X_3 = 1,\ X_2 = 2,\ X_1 = 1,\ X_0 = 2] = P(X_0 = 2)\,P_{21}\,P_{12}\,P_{21} = \frac{1}{3}\cdot\frac{3}{4}\cdot\frac{1}{4}\cdot\frac{3}{4} = \frac{3}{64}.$
Also $P[X_2 = 1,\ X_0 = 0] = P(X_0 = 0)\,P_{01}^{(2)} = \frac{1}{3}\cdot\frac{5}{16} = \frac{5}{48}.$
Example 2.3.10 There are 2 white balls in bag A and 3 red balls in bag B. At each step of the process, a ball is selected from each bag and the two balls selected are interchanged. Let the state $a_i$ of the system be the number of red balls in A after $i$ interchanges. What is the probability that there are 2 red balls in A after 3 steps? In the long run, what is the probability that there are 2 red balls in bag A?
Solution: The state space of the chain $\{X_n\}$ is $(0, 1, 2)$, since the number of balls in bag A is always 2. Let the transition probability matrix of the chain be
$$P = \begin{pmatrix} P_{00} & P_{01} & P_{02} \\ P_{10} & P_{11} & P_{12} \\ P_{20} & P_{21} & P_{22} \end{pmatrix}.$$
$P_{00} = 0$ (if A has 0 red balls, the interchange necessarily moves a red ball into A), and $P_{02} = P_{20} = 0$ (a single interchange cannot change the number of red balls in a bag by 2).
Let $X_n = 1$, that is, A contains 1 red ball (and 1 white ball) and B contains 1 white and 2 red balls. Then
$$P\{X_{n+1} = 0 / X_n = 1\} = P_{10} = \frac{1}{2}\cdot\frac{1}{3} = \frac{1}{6},\qquad P_{12} = \frac{1}{2}\cdot\frac{2}{3} = \frac{1}{3},\qquad\text{so } P_{11} = \frac{1}{2}.$$
Therefore
$$P = \begin{pmatrix} 0 & 1 & 0 \\ \frac{1}{6} & \frac{1}{2} & \frac{1}{3} \\ 0 & \frac{2}{3} & \frac{1}{3} \end{pmatrix}.$$
$P\{\text{there are 2 red balls in bag A after 3 steps}\} = P\{X_3 = 2\} = \frac{1}{6} + \frac{1}{9} = \frac{5}{18}.$
For the long run, let $\pi = (\pi_1\ \pi_2\ \pi_3)$ satisfy $\pi P = \pi$:
$$\frac{1}{6}\pi_2 = \pi_1,\qquad 6\pi_1 + 3\pi_2 + 4\pi_3 = 6\pi_2,\qquad \pi_2 = 2\pi_3.$$
Therefore $3\pi_1 = \pi_3$, $\pi_2 = 2\pi_3$, and $\pi_1 + \pi_2 + \pi_3 = 1$ gives $\frac{1}{3}\pi_3 + 2\pi_3 + \pi_3 = 1$, i.e., $10\pi_3 = 3$.
Therefore $\pi_3 = \frac{3}{10}$, $\pi_2 = \frac{6}{10}$, and $\pi_1 = \frac{1}{10}$, so the steady-state distribution is $\pi = \big(\frac{1}{10}, \frac{6}{10}, \frac{3}{10}\big)$. In the long run, the probability that there are 2 red balls in bag A is $\frac{3}{10}$.
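The steady-state equations $\pi P = \pi$, $\sum_i \pi_i = 1$ solved by hand above can also be handed to a linear solver; a sketch assuming NumPy, with the t.p.m. of Example 2.3.10:

```python
import numpy as np

# Solve the steady-state system pi @ P = pi with sum(pi) = 1 for the
# two-bag ball chain via an overdetermined least-squares linear solve.
P = np.array([[0.0, 1.0, 0.0],
              [1/6, 1/2, 1/3],
              [0.0, 2/3, 1/3]])

n = P.shape[0]
A = np.vstack([P.T - np.eye(n), np.ones((1, n))])  # (P^T - I) pi = 0, plus normalization
b = np.concatenate([np.zeros(n), [1.0]])
pi = np.linalg.lstsq(A, b, rcond=None)[0]          # expected: (0.1, 0.6, 0.3)
```

Stacking the normalization row onto the singular system $(P^T - I)\pi = 0$ is a standard trick: it picks out the unique probability vector among all scalar multiples of the stationary direction.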
2.4
Exercise
...with state space $S = \{0, 1\}$ and one-step t.p.m. $P = \begin{pmatrix} 1 & 0 \\ \frac{1}{2} & \frac{1}{2} \end{pmatrix}$.
7. At an intersection, a working traffic light will be out of order the next day with probability 0.07, and an out-of-order traffic light will be working the next day with probability 0.88. Let $X_n = 1$ if on day $n$ the traffic light will work, and $X_n = 0$ if on day $n$ it will not work. Is $\{X_n;\ n = 0, 1, 2, \dots\}$ a Markov chain? If so, write the transition probability matrix.
8. The t.p.m. of a Markov chain with three states 0, 1, 2 is
$$P = \begin{pmatrix} \frac{3}{4} & \frac{1}{4} & 0 \\ \frac{1}{4} & \frac{1}{2} & \frac{1}{4} \\ 0 & \frac{3}{4} & \frac{1}{4} \end{pmatrix}.$$
Chapter 3
Poisson Process
3.1
Basic Definitions
$$\frac{P_n(t + \Delta t) - P_n(t)}{\Delta t} = -\lambda P_n(t) + \lambda P_{n-1}(t).$$
Taking the limit as $\Delta t \to 0$,
$$P_n'(t) = -\lambda P_n(t) + \lambda P_{n-1}(t). \qquad (1)$$
Try a solution of the form
$$P_n(t) = \frac{(\lambda t)^n}{n!}\,f(t), \qquad (2)$$
so that
$$P_n'(t) = \frac{\lambda^n}{n!}\big[n t^{n-1} f(t) + t^n f'(t)\big]. \qquad (3)$$
Substituting (2) and (3) into (1), we get
$$\frac{\lambda^n}{n!}\big[n t^{n-1} f(t) + t^n f'(t)\big] = -\lambda\,\frac{(\lambda t)^n}{n!}\,f(t) + \frac{(\lambda t)^{n-1}}{(n-1)!}\,\lambda\,f(t).$$
The terms $\frac{\lambda^n t^{n-1}}{(n-1)!} f(t)$ on both sides cancel, leaving
$$f'(t) = -\lambda f(t),\qquad\text{i.e.,}\qquad \frac{f'(t)}{f(t)} = -\lambda,$$
so $f(t) = e^{-\lambda t}$ (using $f(0) = P_0(0) = 1$). Therefore
$$P_n(t) = \frac{e^{-\lambda t}(\lambda t)^n}{n!},\qquad n = 0, 1, 2, \dots$$
This is the probability law for the Poisson process. It is to be observed that the probability distribution of $X(t)$ is the Poisson distribution with parameter $\lambda t$.
$$E[X(t)] = \sum_{n=0}^{\infty} n\,P_n(t) = \sum_{n=0}^{\infty}\frac{n(\lambda t)^n}{n!}\,e^{-\lambda t} = e^{-\lambda t}\sum_{n=1}^{\infty}\frac{(\lambda t)^n}{(n-1)!} = \lambda t\,e^{-\lambda t}\Big[1 + \frac{\lambda t}{1!} + \frac{\lambda^2 t^2}{2!} + \dots\Big] = \lambda t\,e^{-\lambda t}\,e^{\lambda t} = \lambda t.$$
$$E[X^2(t)] = \sum_{n=0}^{\infty} n^2 P_n(t) = \sum_{n=0}^{\infty}\frac{n^2 e^{-\lambda t}(\lambda t)^n}{n!}.$$
Now $n^2 = n(n-1) + n$. Hence
$$E[X^2(t)] = \sum_{n=0}^{\infty}\frac{\big[n(n-1) + n\big]e^{-\lambda t}(\lambda t)^n}{n!} = \sum_{n=0}^{\infty}\frac{n(n-1)e^{-\lambda t}(\lambda t)^n}{n!} + \sum_{n=0}^{\infty}\frac{n\,e^{-\lambda t}(\lambda t)^n}{n!}$$
$$= e^{-\lambda t}\sum_{n=2}^{\infty}\frac{(\lambda t)^n}{(n-2)!} + \lambda t = e^{-\lambda t}\lambda^2 t^2\Big[\frac{1}{0!} + \frac{\lambda t}{1!} + \dots\Big] + \lambda t = e^{-\lambda t}\lambda^2 t^2\,e^{\lambda t} + \lambda t = \lambda^2 t^2 + \lambda t.$$
Hence $\operatorname{Var}[X(t)] = \lambda^2 t^2 + \lambda t - (\lambda t)^2 = \lambda t.$
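The defining property that mean and variance of $X(t)$ are both $\lambda t$ is easy to confirm by simulation; a sketch assuming NumPy, with illustrative values $\lambda = 3$, $t = 2$:

```python
import numpy as np

# Sample X(t) ~ Poisson(lam*t) many times and compare the sample mean and
# variance with the common theoretical value lam*t (here 3.0 * 2.0 = 6).
rng = np.random.default_rng(2)
lam, t = 3.0, 2.0
counts = rng.poisson(lam * t, size=200_000)

mean_est = counts.mean()    # should be close to lam*t
var_est = counts.var()      # should also be close to lam*t
```

The equality of mean and variance is a quick diagnostic in practice: count data whose sample variance greatly exceeds its sample mean is over-dispersed relative to a Poisson model.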
3.2
Property 1: The Poisson process is a Markov process, since
$$P[X(t_3) = n_3 / X(t_1) = n_1,\ X(t_2) = n_2] = P[X(t_3) = n_3 / X(t_2) = n_2].$$
This means that the conditional probability distribution of $X(t_3)$, given all the past values $X(t_1) = n_1$, $X(t_2) = n_2$, depends only on the most recent value $X(t_2) = n_2$. Therefore the Poisson process possesses the Markovian property. Hence the Poisson process is a Markov process.
Property 2: The sum of independent Poisson processes is a Poisson process.
Proof: Let $X(t) = X_1(t) + X_2(t)$, where $X_1$ and $X_2$ are independent Poisson processes with parameters $\lambda_1$ and $\lambda_2$. Then
$$P[X(t) = n] = \sum_{k=0}^{n} P[X_1(t) = k]\,P[X_2(t) = n-k] = \sum_{k=0}^{n}\frac{e^{-\lambda_1 t}(\lambda_1 t)^k}{k!}\cdot\frac{e^{-\lambda_2 t}(\lambda_2 t)^{n-k}}{(n-k)!}$$
$$= \frac{e^{-(\lambda_1 + \lambda_2)t}}{n!}\sum_{k=0}^{n}\frac{n!}{k!\,(n-k)!}\,(\lambda_1 t)^k(\lambda_2 t)^{n-k} = \frac{e^{-(\lambda_1 + \lambda_2)t}}{n!}\,(\lambda_1 t + \lambda_2 t)^n,\qquad n = 0, 1, 2, \dots,$$
which is the probability law of a Poisson process with parameter $\lambda_1 + \lambda_2$.
Property 4: The inter-arrival time of a Poisson process, i.e., the interval between two successive occurrences of a Poisson process with parameter $\lambda$, has an exponential distribution with mean $\frac{1}{\lambda}$.
Proof:
Let two consecutive occurrences of the event be $E_i$ and $E_{i+1}$. Let $E_i$ take place at time instant $t_i$ and let $T$ be the interval between the occurrences of $E_i$ and $E_{i+1}$. $T$ is a continuous random variable.
$$P[T > t] = P\{E_{i+1}\text{ did not occur in }(t_i,\ t_i + t)\} = P[\text{no event occurs in an interval of length } t] = P[X(t) = 0] = e^{-\lambda t}.$$
Therefore the cumulative distribution function of $T$ is given by
$$F(t) = P[T \le t] = 1 - P[T > t] = 1 - e^{-\lambda t}.$$
Therefore the p.d.f. of $T$ is the exponential distribution with parameter $\lambda$, given by $f(t) = \lambda e^{-\lambda t}\ (t \ge 0)$.
3.3
If each occurrence of a Poisson process with parameter $\lambda$ is recorded independently with probability $p$, the recorded process is again Poisson, with parameter $\lambda p$:
$$P[N(t) = n] = \frac{e^{-\lambda p t}(\lambda p t)^n}{n!},\qquad n = 0, 1, 2, \dots$$
Example
Example 3.3.1 If $\{N_1(t)\}$ and $\{N_2(t)\}$ are two independent Poisson processes with parameters $\lambda_1$ and $\lambda_2$ respectively, show that $P[N_1(t) = k / N_1(t) + N_2(t) = n] = nC_k\,p^k q^{n-k}$, where $p = \frac{\lambda_1}{\lambda_1 + \lambda_2}$ and $q = \frac{\lambda_2}{\lambda_1 + \lambda_2}$.
Solution: $P[N_i(t) = r] = \frac{e^{-\lambda_i t}(\lambda_i t)^r}{r!},\ r = 0, 1, 2, \dots$ By independence,
$$P[N_1(t) = k / N_1(t) + N_2(t) = n] = \frac{P[N_1(t) = k]\,P[N_2(t) = n-k]}{P[N_1(t) + N_2(t) = n]} = \frac{\dfrac{e^{-\lambda_1 t}(\lambda_1 t)^k}{k!}\cdot\dfrac{e^{-\lambda_2 t}(\lambda_2 t)^{n-k}}{(n-k)!}}{\dfrac{e^{-(\lambda_1+\lambda_2)t}\big[(\lambda_1+\lambda_2)t\big]^n}{n!}}$$
$$= nC_k\,\frac{\lambda_1^k\lambda_2^{n-k}}{(\lambda_1+\lambda_2)^n} = nC_k\Big(\frac{\lambda_1}{\lambda_1+\lambda_2}\Big)^k\Big(\frac{\lambda_2}{\lambda_1+\lambda_2}\Big)^{n-k}.$$
Taking $p = \frac{\lambda_1}{\lambda_1+\lambda_2}$ and $q = 1 - p = \frac{\lambda_2}{\lambda_1+\lambda_2}$, we have the required binomial form.
Particles arrive according to a Poisson process at a rate of 6 per minute, each arriving particle having probability $\frac{1}{3}$ of being recorded. Find the probability that at least 5 particles are recorded in a 5-minute period.
Solution: Let $N(t)$ be the number of recorded particles. Then $\{N(t)\}$ is a Poisson process with $\lambda p$ as parameter. Now $\lambda p = 6\big(\frac{1}{3}\big) = 2$, so
$$P[N(t) = n] = \frac{e^{-2t}(2t)^n}{n!},\qquad n = 0, 1, 2, \dots$$
$$P[N(5) \ge 5] = 1 - P[N(5) \le 4] = 1 - e^{-10}\Big[1 + 10 + \frac{10^2}{2!} + \frac{10^3}{3!} + \frac{10^4}{4!}\Big] = 0.9707.$$
For an exponential inter-arrival time $T$ with parameter $\lambda = 3$:
1. $P(T > 1) = 3\displaystyle\int_1^{\infty} e^{-3t}\, dt = 3\Big[\dfrac{e^{-3t}}{-3}\Big]_1^{\infty} = e^{-3}.$
2. $P(1 < T < 2) = \displaystyle\int_1^{2} 3e^{-3t}\, dt = 3\Big[\dfrac{e^{-3t}}{-3}\Big]_1^{2} = e^{-3} - e^{-6}.$
3. $P(T \le 4) = 3\displaystyle\int_0^{4} e^{-3t}\, dt = 3\Big[\dfrac{e^{-3t}}{-3}\Big]_0^{4} = -\big[e^{-12} - e^{0}\big] = 1 - e^{-12}.$
Example 3.3.4 A machine goes out of order, whenever a component fails. The
failure of this part follows a Poisson process with a mean rate of 1 per week. Find
the probability that 2 weeks have elapsed since last failure. If there are 5 spare
parts of this component in an inventory and that the next supply is not due in 10
weeks, find the probability that the machine will not be out of order in the next
10 weeks.
Solution: Let $X(t)$ be the number of failures in $(0, t)$, a Poisson process with $\lambda = 1$ per week.
1. $P[\text{2 weeks have elapsed since the last failure}] = P[X(2) = 0] = \dfrac{e^{-2}(2)^0}{0!} = e^{-2} = 0.135.$
2. There are only 5 spare parts, and the machine should not go out of order in the next 10 weeks. Therefore the required probability is $P[X(10) \le 5]$:
$$P[X(10) \le 5] = \sum_{n=0}^{5}\frac{e^{-10}(10)^n}{n!} = e^{-10}\Big[1 + 10 + \frac{10^2}{2!} + \frac{10^3}{3!} + \frac{10^4}{4!} + \frac{10^5}{5!}\Big] = 0.068.$$
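The two Poisson tail sums of Example 3.3.4 are easy to recompute with the standard library alone; a brief sketch:

```python
import math

# Recompute Example 3.3.4 with lam = 1 failure per week:
# P(no failure in 2 weeks) = e^-2, and P(at most 5 failures in 10 weeks).
lam = 1.0

p_no_failure_2w = math.exp(-lam * 2)   # approx 0.135

m = lam * 10                           # Poisson mean over the 10-week window
p_at_most_5 = sum(math.exp(-m) * m**n / math.factorial(n) for n in range(6))
# p_at_most_5 is approx 0.067, matching the hand computation above
```

Summing the Poisson p.m.f. term by term like this is exactly the bracketed series in the worked solution, just evaluated mechanically.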
Example 3.3.5 What will be the superposition of $n$ independent Poisson processes with respective average rates $\lambda_1, \lambda_2, \dots, \lambda_n$?
Solution: By repeated application of Property 2, the superposition is again a Poisson process, with average rate $\lambda_1 + \lambda_2 + \dots + \lambda_n$.
For a Poisson process and $s < t$, the conditional distribution of $X(s)$ given $X(t) = n$ is binomial:
$$P[X(s) = r / X(t) = n] = \frac{P[X(s) = r \cap X(t) = n]}{P[X(t) = n]} = \frac{P[X(s) = r \cap X(t) - X(s) = n - r]}{P[X(t) = n]} = \frac{P[X(s) = r]\,P[X(t-s) = n-r]}{P[X(t) = n]}$$
$$= \frac{\Big\{\dfrac{e^{-\lambda s}(\lambda s)^r}{r!}\Big\}\Big\{\dfrac{e^{-\lambda(t-s)}\big[\lambda(t-s)\big]^{n-r}}{(n-r)!}\Big\}}{\dfrac{e^{-\lambda t}(\lambda t)^n}{n!}} = \frac{n!}{r!\,(n-r)!}\cdot\frac{s^r(t-s)^{n-r}}{t^n} = nC_r\Big(\frac{s}{t}\Big)^r\Big(\frac{t-s}{t}\Big)^{n-r}.$$
Example 3.3.7 A fisherman catches fish at a Poisson rate of 2 per hour from a large lake with lots of fish. If he starts fishing at 10:00 am, what is the probability that he catches one fish by 10:30 am and three fish by noon?
Solution: Let $N(t)$ be the total number of fish caught at or before time $t$. Then $N(0) = 0$ and $\{N(t) : t \ge 0\}$ has stationary and independent increments, so $\{N(t) : t \ge 0\}$ is a Poisson process. Here $\lambda = 2$ and
$$P[N(t) = n] = \frac{e^{-2t}(2t)^n}{n!},\qquad n = 0, 1, 2, \dots$$
By independent increments (the intervals $(0, \frac{1}{2})$ and $(\frac{1}{2}, 2)$ do not overlap),
$$P\Big[N\big(\tfrac{1}{2}\big) = 1 \text{ and } N(2) = 3\Big] = P\Big[N\big(\tfrac{1}{2}\big) = 1\Big]\,P\Big[N(2) - N\big(\tfrac{1}{2}\big) = 2\Big] = \Big\{\frac{e^{-1}\cdot 1}{1!}\Big\}\Big\{\frac{e^{-3}\,3^2}{2!}\Big\} = 0.082.$$
3.4
Birth and Death Process
We now turn to a random process with wide applications in several fields of natural phenomena, such as queuing problems, telephone exchanges, traffic maintenance, epidemics, and population growth: the birth and death process.
Therefore
$$P_n(t + \Delta t) = P_n(t)(1 - \lambda_n\Delta t)(1 - \mu_n\Delta t) + P_{n-1}(t)\,\lambda_{n-1}\Delta t\,(1 - \mu_{n-1}\Delta t) + P_{n+1}(t)(1 - \lambda_{n+1}\Delta t)\,\mu_{n+1}\Delta t + P_n(t)\,\lambda_n\Delta t\,\mu_n\Delta t.$$
Omitting terms containing $(\Delta t)^2$,
$$P_n(t + \Delta t) = P_n(t) - (\lambda_n + \mu_n)P_n(t)\Delta t + \lambda_{n-1}P_{n-1}(t)\Delta t + \mu_{n+1}P_{n+1}(t)\Delta t,$$
so that
$$\frac{P_n(t + \Delta t) - P_n(t)}{\Delta t} = -(\lambda_n + \mu_n)P_n(t) + \lambda_{n-1}P_{n-1}(t) + \mu_{n+1}P_{n+1}(t),\qquad n \ge 1. \qquad (1)$$
Letting $\Delta t \to 0$,
$$P_n'(t) = -(\lambda_n + \mu_n)P_n(t) + \lambda_{n-1}P_{n-1}(t) + \mu_{n+1}P_{n+1}(t). \qquad (2)$$
Similarly, for $n = 0$,
$$\frac{P_0(t + \Delta t) - P_0(t)}{\Delta t} = -\lambda_0 P_0(t) + \mu_1 P_1(t), \qquad (3)$$
and letting $\Delta t \to 0$,
$$P_0'(t) = -\lambda_0 P_0(t) + \mu_1 P_1(t). \qquad (4)$$
In the steady state, $P_n'(t) = 0$ for all $n \ge 0$, and the equations become
$$(\lambda_n + \mu_n)P_n = \lambda_{n-1}P_{n-1} + \mu_{n+1}P_{n+1},\qquad n \ge 1, \qquad (5)$$
and
$$\lambda_0 P_0 = \mu_1 P_1. \qquad (6)$$
These equations (5) and (6) are known as the balance equations.
By re-arranging equation (5), we get
$$\lambda_n P_n - \mu_{n+1}P_{n+1} = \lambda_{n-1}P_{n-1} - \mu_n P_n = \dots = \lambda_0 P_0 - \mu_1 P_1.$$
But from (6), $\lambda_0 P_0 - \mu_1 P_1 = 0$, so $\lambda_n P_n = \mu_{n+1}P_{n+1}$ for all $n$, i.e., $P_1 = \frac{\lambda_0}{\mu_1}P_0$ and in general $P_n = \frac{\lambda_{n-1}}{\mu_n}P_{n-1}$.
Proceeding,
$$P_2 = \frac{\lambda_1}{\mu_2}P_1 = \frac{\lambda_0\lambda_1}{\mu_1\mu_2}P_0,\qquad P_3 = \frac{\lambda_2}{\mu_3}P_2 = \frac{\lambda_0\lambda_1\lambda_2}{\mu_1\mu_2\mu_3}P_0,$$
$$P_n = \frac{\lambda_0\lambda_1\lambda_2\cdots\lambda_{n-1}}{\mu_1\mu_2\mu_3\cdots\mu_n}\,P_0 = P_0\prod_{i=0}^{n-1}\frac{\lambda_i}{\mu_{i+1}},\qquad n \ge 1. \qquad (7)$$
Since $\sum_{n\ge 0} P_n = 1$, we have
$$P_0\Big[1 + \sum_{n=1}^{\infty}\prod_{i=0}^{n-1}\frac{\lambda_i}{\mu_{i+1}}\Big] = 1. \qquad (8)$$
Therefore
$$P_0 = \frac{1}{1 + \displaystyle\sum_{n=1}^{\infty}\prod_{i=0}^{n-1}\Big(\frac{\lambda_i}{\mu_{i+1}}\Big)}. \qquad (9)$$
Therefore the steady-state solutions of the birth-death process are given by
$$P_n = P_0\prod_{i=0}^{n-1}\frac{\lambda_i}{\mu_{i+1}},\quad n \ge 1,\qquad\text{and}\qquad P_0 = \frac{1}{1 + \displaystyle\sum_{n=1}^{\infty}\prod_{i=0}^{n-1}\Big(\frac{\lambda_i}{\mu_{i+1}}\Big)}.$$
3.5
A steady-state distribution of the birth-death process exists provided the series $\sum_{n=1}^{\infty}\prod_{i=0}^{n-1}\frac{\lambda_i}{\mu_{i+1}}$ converges.
Exercise
(b) 0.1328 (c) 0.0183 (d) 0.1353
Ans: (c)
4. A birth and death process is a
(a) Ergodic (b) finite Markovian (c) Stationary (d) None
Ans: (b)
5. Which one of the following is a Poisson process?
(a) The arrival of a customer at a bank (b) Random walk with reflecting barriers (c) The time duration between consecutive cars passing a fixed location on a road.
Ans: (a)