Ch7 Markov Chains Part II
TRANSCRIPT
7/27/2019 Ch7 Markov Chains Part II

241-460 Introduction to Queueing Networks: Engineering Approach
Assoc. Prof. Thossaporn Kamolphiwong
Centre for Network Research (CNR)
Department of Computer Engineering, Faculty of Engineering
Prince of Songkla University, Thailand
Email: [email protected]
Outline
Markov Chains Part II
Birth and Death Process
Transition Matrix
State Transition, Steady State
Flow-Based Approach
Examples
Chapter 7 : Markov Chains
Birth-Death Process
A special case of a Markov process in which transitions occur only between neighboring states k+1, k, and k-1.

A continuous-time Markov chain {X(t) | t ≥ 0} with state space 0, 1, 2, ... is known as a birth-death process if there exist constants λk (k = 0, 1, ...) and μk (k = 1, 2, ...) such that the transition rates are given by

qk,k+1 = λk
qk,k-1 = μk
qkj = 0 for |k - j| > 1
qkk = 1 - (λk + μk)
Birth-Death Process Example
[State transition diagram: states 0, 1, 2, ..., k-1, k; birth rates λ0, λ1, ..., λk-1 on the forward arcs, death rates μ1, μ2, ..., μk on the backward arcs]

Birth rate λk is the rate at which births occur when the population is of size k.
Death rate μk is the rate at which deaths occur when the population is of size k.
Transition Matrix
[State transition diagram: states 0, 1, 2, ..., k-1, k with birth rates λ0, λ1, ..., λk-1 and death rates μ1, μ2, ..., μk]

In general form:

        | q00  q01  q02  ...  q0k |
        | q10  q11  q12  ...  q1k |
Q   =   |  :    :    :         :  |
        | qk0  qk1  qk2  ...  qkk |

For the birth-death process:

        | 1-λ0    λ0          0           0           ...  |
        | μ1      1-(λ1+μ1)   λ1          0           ...  |
Q   =   | 0       μ2          1-(λ2+μ2)   λ2          ...  |
        | 0       0           μ3          1-(λ3+μ3)   λ3 ...|
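The matrix structure can be sketched in code. A minimal sketch, not from the slides: build a truncated matrix for given birth and death rates, following the slide's convention qkk = 1 - (λk + μk); the function name and example rates are illustrative.

```python
import numpy as np

def birth_death_matrix(lam, mu):
    # lam[k] : birth rate out of state k     (k = 0 .. n-2)
    # mu[k]  : death rate out of state k+1   (k = 0 .. n-2)
    n = len(lam) + 1
    Q = np.zeros((n, n))
    for k in range(n - 1):
        Q[k, k + 1] = lam[k]        # q_{k,k+1} = lambda_k
        Q[k + 1, k] = mu[k]         # q_{k+1,k} = mu_{k+1}
    for k in range(n):
        Q[k, k] = 1 - Q[k].sum()    # q_{kk} = 1 - (lambda_k + mu_k)
    return Q

Q = birth_death_matrix([2.0, 2.0, 2.0], [3.0, 3.0, 3.0])
# entries more than one step off the diagonal stay zero, and rows sum to 1
```

Only the tridiagonal band is populated, which is exactly the "neighboring states only" property of the birth-death process.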
State Distribution
[State transition diagram: states 0, 1, 2, ..., k-1, k, k+1 with birth rates λk and death rates μk]

Let X(t) : number of customers in the system at time t
Pk(t) : probability of finding the system in state k at time t

Then
Pk(t) = P[X(t) = k]
State Transition
Consider state k at time t + Δt. It could have occurred in one of three ways:
k - 1 members in the population at time t, and we had a birth during the interval (t, t + Δt)
k + 1 members in the population at time t, and we had one death during the interval (t, t + Δt)
k members in the population at time t, and no change during the interval (t, t + Δt)
State Transition
[Diagram: three paths into state k at time t + Δt — a death from k+1, no change in k, or a birth from k-1]

Pk(t+Δt) = Pk-1(t)pk-1,k(Δt) + Pk(t)pk,k(Δt) + Pk+1(t)pk+1,k(Δt)
(Continue)
Let λk = birth rate in state k
μk = death rate in state k

Then
P[state k to state k+1 in Δt] = λkΔt
P[state k to state k-1 in Δt] = μkΔt
P[state k to state k in Δt] = 1 - (λk + μk)Δt
P[state k to any other state in Δt] = 0
State Transition
[State transition diagram: states 0, 1, ..., k-1, k, k+1 with birth rates λk and death rates μk]

k = 0:
P0(t+Δt) = P0(t)p00(Δt) + P1(t)p10(Δt)
         = P0(t)[1 - (λ0 + μ0)Δt] + P1(t)μ1Δt
         = P0(t)[1 - λ0Δt] + P1(t)μ1Δt        (since μ0 = 0)
         = P0(t) - λ0ΔtP0(t) + μ1ΔtP1(t)
(Continue)
k = 0:
P0(t+Δt) = P0(t) - λ0ΔtP0(t) + μ1ΔtP1(t)

[P0(t+Δt) - P0(t)] / Δt = -λ0P0(t) + μ1P1(t)
(Continue)
[State transition diagram: states 0, 1, ..., k-1, k, k+1 with birth rates λk and death rates μk]

k ≥ 1:
Pk(t+Δt) = Pk(t)pk,k(Δt) + Pk-1(t)pk-1,k(Δt) + Pk+1(t)pk+1,k(Δt)
         = Pk(t)[1 - (λk + μk)Δt] + Pk-1(t)λk-1Δt + Pk+1(t)μk+1Δt
         = Pk(t) - (λk + μk)ΔtPk(t) + λk-1ΔtPk-1(t) + μk+1ΔtPk+1(t)
(Continue)
k ≥ 1:
Pk(t+Δt) = Pk(t) - (λk + μk)ΔtPk(t) + λk-1ΔtPk-1(t) + μk+1ΔtPk+1(t)

[Pk(t+Δt) - Pk(t)] / Δt = -(λk + μk)Pk(t) + λk-1Pk-1(t) + μk+1Pk+1(t)
(Continue)
Taking the limit as Δt → 0:

lim (Δt→0) [Pk(t+Δt) - Pk(t)] / Δt = dPk(t)/dt

k = 0:
dP0(t)/dt = -λ0P0(t) + μ1P1(t)

k ≥ 1:
dPk(t)/dt = -(λk + μk)Pk(t) + λk-1Pk-1(t) + μk+1Pk+1(t)
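These differential equations can be integrated numerically. A sketch, not from the slides: truncate the chain at n states and take simple forward-Euler steps; the rates, state count, and step size below are illustrative.

```python
import numpy as np

def euler_step(P, lam, mu, dt):
    # dPk/dt = -(lam_k + mu_k)Pk + lam_{k-1}P_{k-1} + mu_{k+1}P_{k+1}
    n = len(P)
    dP = np.zeros(n)
    for k in range(n):
        dP[k] = -(lam[k] + mu[k]) * P[k]
        if k > 0:
            dP[k] += lam[k - 1] * P[k - 1]
        if k < n - 1:
            dP[k] += mu[k + 1] * P[k + 1]
    return P + dt * dP

n = 10
lam = np.full(n, 1.0); lam[-1] = 0.0   # no birth out of the top state
mu = np.full(n, 2.0);  mu[0] = 0.0     # no death out of state 0
P = np.zeros(n); P[0] = 1.0            # start empty
for _ in range(50000):                 # integrate to t = 50
    P = euler_step(P, lam, mu, 1e-3)
# P approaches the geometric steady state with ratio lam/mu = 1/2
```

Because the boundary rates are zeroed, each Euler step conserves total probability, and the distribution settles toward the steady state derived on the following slides.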
Steady State
In steady state, dPk(t)/dt = 0. Then

-λ0P0 + μ1P1 = 0                               k = 0
-(λk + μk)Pk + λk-1Pk-1 + μk+1Pk+1 = 0         k ≥ 1

[State transition diagram: states 0, 1, ..., k-1, k, k+1 with birth rates λk and death rates μk]
Steady State
Re-arranging,

μ1P1 = λ0P0                               k = 0
λk-1Pk-1 + μk+1Pk+1 = λkPk + μkPk         k ≥ 1

Flow rate into k = Flow rate out of k
Flow-Based Method
[State transition diagram: states k-2, k-1, k, k+1 with birth rates λk-1, λk and death rates μk, μk+1]

Flow rate into state k = λk-1Pk-1(t) + μk+1Pk+1(t)
Flow rate out of state k = (λk + μk)Pk(t)

Effective probability flow rate at k = Flow into state k - Flow out of state k
Flow-Based Approach
The flow-based approach is a way of solving for the state probabilities. This approach is usable if the arrival process is a Poisson process and the service process has exponentially distributed service times. Steps for the flow-based approach:
(1) Draw the state transition diagram.
(2) Draw a closed boundary and write the flow balance equation across that boundary.
(3) Solve the equations in (2) to obtain the equilibrium state probability distribution.
Flow Balance Equations
Draw a closed boundary around state k. Flow balance across it gives the Global Balance Equation:

Σ(j≠k) Pk pkj = Σ(j≠k) Pj pjk

Draw a closed boundary between state k and state k+1. Flow balance across it gives the Detailed Balance Equation:

pk,k+1 Pk = pk+1,k Pk+1
Detailed Balance Equation
The detailed balance equations lead to

λk-1 Pk-1 = μk Pk,  k = 1, 2, 3, ...

Pk = (λk-1 / μk) Pk-1,  k = 1, 2, 3, ...

Pk = P0 Π(i=1..k) (λi-1 / μi),  k = 1, 2, 3, ...

P0 = 1 / [1 + Σ(k=1..∞) Π(i=1..k) (λi-1 / μi)]

P0 is the probability of the system being empty.
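The product form translates directly into code. A minimal sketch, not from the slides, with the chain truncated at n states and illustrative names and rates:

```python
def steady_state(lam, mu, n):
    # ratio[k] = Pk / P0 = prod_{i=1..k} lam[i-1] / mu[i]; mu[0] is unused
    ratio = [1.0]
    for k in range(1, n):
        ratio.append(ratio[-1] * lam[k - 1] / mu[k])
    P0 = 1.0 / sum(ratio)          # normalization: probabilities sum to 1
    return [P0 * r for r in ratio]

# Example: constant rates lam = 1, mu = 2 give a geometric distribution
lam = [1.0] * 9
mu = [0.0] + [2.0] * 9
P = steady_state(lam, mu, 10)
```

With constant rates, each successive Pk drops by the factor λ/μ, which matches the detailed balance relation Pk = (λk-1/μk)Pk-1.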
Pure Birth System
[State transition diagram: states 0, 1, 2, ..., k-1, k, k+1 with birth rate λ on every forward arc]

Let μk = 0 for all k
λk = λ for all k = 0, 1, 2, ...

The system begins at time 0 with 0 members.
Pure Birth System
The differential equations become

dPk(t)/dt = -λPk(t) + λPk-1(t),  k ≥ 1
dP0(t)/dt = -λP0(t),  k = 0

Solution for P0(t):
P0(t) = e^(-λt)
Pure Birth System
For k = 1:

dP1(t)/dt = -λP1(t) + λP0(t)
dP1(t)/dt = -λP1(t) + λe^(-λt)

Solution: P1(t) = λt e^(-λt)

In general,

Pk(t) = (λt)^k e^(-λt) / k!,  k ≥ 0, t ≥ 0
(Continue)
For λ > 0, t > 0, the solution

Pk(t) = (λt)^k e^(-λt) / k!

is the Poisson distribution.

Pk(t) is the probability that k arrivals occur during the time interval (0, t)
λ is the average rate at which customers arrive
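A quick numerical check of the Poisson solution; the rate and time below are chosen purely for illustration:

```python
from math import exp, factorial

def pure_birth_pk(k, t, lam):
    # Pk(t) = (lam*t)**k * exp(-lam*t) / k!
    return (lam * t) ** k * exp(-lam * t) / factorial(k)

lam, t = 2.0, 3.0
probs = [pure_birth_pk(k, t, lam) for k in range(100)]
mean = sum(k * p for k, p in enumerate(probs))
# the probabilities sum to 1 and the mean number of arrivals is lam*t = 6
```

The mean λt is exactly what one expects: arrivals at average rate λ over an interval of length t.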
Pure Death System
[State transition diagram: states 0, 1, 2, ..., k-1, k, k+1 with death rate μ on every backward arc]

Let μk = μ for all k = 1, 2, ..., N
λk = 0 for all k

The system begins with N members.
Pure Death System
The differential equations become

dPN(t)/dt = -μPN(t),  k = N
dPk(t)/dt = -μPk(t) + μPk+1(t),  0 < k < N
dP0(t)/dt = μP1(t),  k = 0

Solution for Pk(t):

Pk(t) = (μt)^(N-k) e^(-μt) / (N-k)!,  0 < k ≤ N

dP0(t)/dt = μ(μt)^(N-1) e^(-μt) / (N-1)!

so the time until the system empties has the Erlang distribution.
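The pure death solution can be checked the same way. A sketch assuming the reconstruction above: for k ≥ 1, Pk(t) is the probability that exactly N - k deaths have occurred by time t, and P0(t) collects the remaining probability mass; the parameter values are illustrative.

```python
from math import exp, factorial

def pure_death_pk(k, t, mu, N):
    # Pk(t) = (mu*t)**(N-k) * exp(-mu*t) / (N-k)!  for k = 1..N
    if k >= 1:
        return (mu * t) ** (N - k) * exp(-mu * t) / factorial(N - k)
    # P0(t): everything not accounted for by states 1..N
    return 1.0 - sum(pure_death_pk(j, t, mu, N) for j in range(1, N + 1))

mu, N, t = 1.0, 5, 2.0
probs = [pure_death_pk(k, t, mu, N) for k in range(N + 1)]
```

Note PN(t) = e^(-μt), the probability that no death has yet occurred, mirroring P0(t) = e^(-λt) in the pure birth system.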
Birth-Death Process Example
A ticket reservation system has 2 computers, one on-line and one standby. The operating computer fails after an exponentially distributed time, having mean tf, and then it is replaced by the standby computer, i.e., at any time only 1 or 0 computers are operating. There is one repair facility, i.e., at any time only 1 or 0 computers can be repaired. The repair times are exponentially distributed with mean tr. What fraction of the time will the system be down? (that is, both computers failed)
Solution
State = number of working (not failed) computers

[State transition diagram: states 0, 1, 2; repair transitions 0→1 and 1→2 at rate 1/tr, failure transitions 2→1 and 1→0 at rate 1/tf]

P0 : probability that both computers have failed
Solution
Finding the steady state: for each state, Rate In = Rate Out

State 0:  (1/tf)P1 = (1/tr)P0
State 1:  (1/tr)P0 + (1/tf)P2 = (1/tf)P1 + (1/tr)P1
State 2:  (1/tr)P1 = (1/tf)P2
Solution
From state 0: (1/tf)P1 = (1/tr)P0, so P1 = (tf/tr)P0
From state 2: (1/tf)P2 = (1/tr)P1, so P2 = (tf/tr)P1 = (tf/tr)²P0
Solution
Normalization: Σ(n=0..2) Pn = 1

P0 + P1 + P2 = 1
P0 [1 + (tf/tr) + (tf/tr)²] = 1

P0 = tr² / (tr² + tf·tr + tf²)
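Plugging in numbers makes the result concrete; the failure and repair means below are assumed for illustration, not taken from the slides:

```python
def downtime_fraction(tf, tr):
    # P0 = tr^2 / (tr^2 + tf*tr + tf^2), from the balance equations
    return tr ** 2 / (tr ** 2 + tf * tr + tf ** 2)

# e.g. mean time to failure 100 h, mean repair time 2 h
p_down = downtime_fraction(100.0, 2.0)
# p_down = 4 / 10204, i.e. the system is down about 0.04% of the time
```

When tf >> tr, P0 shrinks roughly like (tr/tf)², since both computers must fail before the first repair completes.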
Example
A gasoline station has only one pump. Cars arrive at a rate of 20/hour. However, if the pump is already in use, some potential customers 'balk', i.e., drive on to another gasoline station. If there are n cars already at the station, the probability that an arriving car will balk is n/4 for n = 1, 2, 3, 4, and 1 for n > 4. The time required to service a car is exponentially distributed with mean 3 min.

What is the probability of no cars in the gasoline station?
(Continue)
Cars arrive at a rate of 20/hour. If there are n cars already at the station, the probability that an arriving car will balk is n/4 for n = 1, 2, 3, 4, and 1 for n > 4. Therefore

λ0 = 20, λ1 = 20(3/4) = 15, λ2 = 20(2/4) = 10, λ3 = 20(1/4) = 5 per hour
λn = 0 for n ≥ 4

Service time is exponentially distributed with mean 3 min, so
μn = (1/3)·60 = 20/hour for all n
Solution
[State transition diagram: states 0, 1, 2, 3, 4; arrival rates 20, 20(3/4), 20(2/4), 20(1/4) on the forward arcs, service rate 20 on each backward arc]

Rate In = Rate Out:

20P0 = 20P1
20(3/4)P1 = 20P2
20(2/4)P2 = 20P3
20(1/4)P3 = 20P4

Σ(n=0..4) Pn = 1:
P0 + P1 + P2 + P3 + P4 = 1
Solution
Rate In = Rate Out:

20P0 = 20P1, so P1 = P0
20(3/4)P1 = 20P2, so P2 = (3/4)P0
20(2/4)P2 = 20P3, so P3 = (3/8)P0
20(1/4)P3 = 20P4, so P4 = (3/4)(2/4)(1/4)P0 = (3/32)P0

P0 + P1 + P2 + P3 + P4 = 1
Solution
P0 + P0 + (3/4)P0 + (3/8)P0 + (3/32)P0 = 1

P0 = 32/103 = 0.31068
P1 = 0.31068
P2 = 0.23301
P3 = 0.116505
P4 = 0.029126
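The same numbers fall out of the product form directly; exact rational arithmetic confirms the slide's values:

```python
from fractions import Fraction

# lam_n = 20 * (1 - n/4) for n = 0..3; service rate mu = 20
lam = [Fraction(20) * (1 - Fraction(n, 4)) for n in range(4)]  # 20, 15, 10, 5
mu = Fraction(20)

ratio = [Fraction(1)]              # Pn / P0 for n = 0..4
for n in range(4):
    ratio.append(ratio[-1] * lam[n] / mu)

P0 = 1 / sum(ratio)
P = [P0 * r for r in ratio]
# P = [32/103, 32/103, 24/103, 12/103, 3/103]
```

So P0 = 32/103 ≈ 0.31068, matching the rate-balance calculation above.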