TCOM 501:
Networking Theory & Fundamentals
Lectures 4 & 5
February 5 and 12, 2003
Prof. Yannis A. Korilis
4&5-2
Topics
 Markov Chains
 M/M/1 Queue
 Poisson Arrivals See Time Averages
 M/M/* Queues
 Introduction to Sojourn Times
4&5-3
The M/M/1 Queue
 Arrival process: Poisson with rate λ
 Service times: iid, exponential with parameter µ
 Service times and interarrival times: independent
 Single server
 Infinite waiting room
 N(t): Number of customers in system at time t (state)
[Transition diagram: states 0, 1, 2, …, n, n+1, …; each arrival (rate λ) moves the chain one state up, each departure (rate µ) one state down.]
4&5-4


Exponential Random Variables
 X: exponential RV with parameter λ
 Y: exponential RV with parameter µ
 X, Y: independent
Then:
1. min{X, Y}: exponential RV with parameter λ + µ
2. P{X < Y} = λ/(λ + µ)
[Exercise 3.12]

 Proof:
P{min{X, Y} > t} = P{X > t, Y > t} = P{X > t} P{Y > t} = e^(−λt) e^(−µt) = e^(−(λ+µ)t)
⇒ P{min{X, Y} ≤ t} = 1 − e^(−(λ+µ)t)

P{X < Y} = ∫∫_{x<y} f_{XY}(x, y) dx dy = ∫₀^∞ ∫₀^y λe^(−λx) µe^(−µy) dx dy
= ∫₀^∞ µe^(−µy) (1 − e^(−λy)) dy = ∫₀^∞ µe^(−µy) dy − ∫₀^∞ µe^(−(λ+µ)y) dy
= 1 − µ/(λ + µ) = λ/(λ + µ)
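The two properties above are easy to check numerically. A minimal Monte Carlo sketch (the rates λ = 2, µ = 3 and the tail point t = 0.3 are arbitrary illustrative choices):

```python
import math
import random

random.seed(0)
lam, mu = 2.0, 3.0        # illustrative rates, chosen arbitrarily
n = 200_000

xs = [random.expovariate(lam) for _ in range(n)]   # X ~ Exp(lam)
ys = [random.expovariate(mu) for _ in range(n)]    # Y ~ Exp(mu), independent of X

# 1. min{X, Y} should be Exp(lam + mu): mean 1/(lam+mu),
#    tail P{min > t} = exp(-(lam+mu) t).
mins = [min(x, y) for x, y in zip(xs, ys)]
mean_min = sum(mins) / n
t = 0.3
tail = sum(m > t for m in mins) / n

# 2. P{X < Y} should equal lam/(lam + mu).
p_x_less = sum(x < y for x, y in zip(xs, ys)) / n

print(mean_min, 1 / (lam + mu))        # both near 0.2
print(tail, math.exp(-(lam + mu) * t))
print(p_x_less, lam / (lam + mu))      # both near 0.4
```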
4&5-5
M/M/1 Queue: Markov Chain Formulation
 Jumps of {N(t): t ≥ 0} triggered by arrivals and departures
{N(t): t ≥ 0} can jump only between neighboring states
Assume the process at time t is in state i: N(t) = i ≥ 1
 X_i: time until the next arrival – exponential with parameter λ
 Y_i: time until the next departure – exponential with parameter µ
 T_i = min{X_i, Y_i}: time the process spends at state i
 T_i: exponential with parameter ν_i = λ + µ
 P_{i,i+1} = P{X_i < Y_i} = λ/(λ+µ), P_{i,i−1} = P{Y_i < X_i} = µ/(λ+µ)
 P_{01} = 1, and T_0 is exponential with parameter λ
{N(t): t ≥ 0} is a continuous-time Markov chain with transition rates
q_{i,i+1} = ν_i P_{i,i+1} = λ, i ≥ 0
q_{i,i−1} = ν_i P_{i,i−1} = µ, i ≥ 1
q_{ij} = 0, |i − j| > 1
4&5-6
M/M/1 Queue: Stationary Distribution
[Transition diagram: states 0, 1, 2, …, n, n+1, …; arrival rate λ and departure rate µ between neighboring states.]

 Birth-death process → DBE: µ p_n = λ p_{n−1} ⇒ p_n = ρ p_{n−1} = ρ² p_{n−2} = … = ρ^n p_0, where ρ = λ/µ
 Normalization constant: Σ_{n=0}^∞ p_n = 1 ⇔ p_0 (1 + ρ + ρ² + …) = 1 ⇔ p_0 = 1 − ρ, if ρ < 1
 Stationary distribution: p_n = (1 − ρ) ρ^n, n = 0, 1, …
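The stationary distribution can be sanity-checked numerically: it should satisfy the detailed balance equations and sum to 1. A quick sketch (λ = 3, µ = 5 are arbitrary example rates):

```python
lam, mu = 3.0, 5.0        # arbitrary example rates with lam < mu
rho = lam / mu            # utilization = 0.6

# Claimed stationary distribution of the M/M/1 queue.
p = [(1 - rho) * rho ** n for n in range(200)]

# Detailed balance: lam * p_{n-1} = mu * p_n for every n >= 1.
dbe_ok = all(abs(lam * p[n - 1] - mu * p[n]) < 1e-9 for n in range(1, 200))

# Normalization: the geometric series sums to 1 (the tail beyond n=200 is negligible).
total = sum(p)

print(dbe_ok, total)      # True, ~1.0
```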
4&5-7
The M/M/1 Queue
 Average number of customers in system
 Little’s Theorem: average time in system
 Average waiting time and number of customers in the
queue – excluding service
N = Σ_{n=0}^∞ n p_n = (1 − ρ) Σ_{n=0}^∞ n ρ^n = ρ (1 − ρ) Σ_{n=0}^∞ n ρ^(n−1) = ρ (1 − ρ) / (1 − ρ)²
⇒ N = ρ/(1 − ρ) = λ/(µ − λ)

T = N/λ = 1/(µ − λ)

W = T − 1/µ = ρ/(µ − λ) and N_Q = λ W = ρ²/(1 − ρ)
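Plugging concrete numbers into these formulas and cross-checking them against Little's theorem (λ = 4, µ = 5 are arbitrary illustrative rates):

```python
lam, mu = 4.0, 5.0            # illustrative rates, lam < mu
rho = lam / mu                # utilization = 0.8

N = rho / (1 - rho)           # average number in system
T = 1 / (mu - lam)            # average time in system
W = T - 1 / mu                # average waiting time (excluding service)
N_Q = lam * W                 # average number waiting in queue

print(N, T, W, N_Q)           # ~4 customers, 1 s, 0.8 s, ~3.2 customers

# Little's theorem ties the pairs together, and the closed forms agree:
assert abs(N - lam * T) < 1e-9              # N = lam * T
assert abs(N_Q - lam * W) < 1e-9            # N_Q = lam * W
assert abs(W - rho / (mu - lam)) < 1e-9     # W = rho/(mu - lam)
assert abs(N_Q - rho ** 2 / (1 - rho)) < 1e-9
```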
4&5-8
The M/M/1 Queue
[Plot: N = ρ/(1 − ρ) versus ρ ∈ (0, 1); the average number in system grows without bound as ρ → 1.]
 ρ = λ/µ: utilization factor
 Long term proportion of time that the server is busy
 ρ = 1 − p_0: holds for any M/G/1 queue
 Stability condition: ρ<1
Arrival rate should be less
than the service rate
4&5-9
M/M/1 Queue: Discrete-Time Approach
 Focus on times 0, δ, 2δ,… (δ arbitrarily small)
 Study the discrete time process N_k = N(δk)
 Show that the transition probabilities are
P_{00} = 1 − λδ + o(δ)
P_{ii} = 1 − λδ − µδ + o(δ), i ≥ 1
P_{i,i+1} = λδ + o(δ), i ≥ 0
P_{i,i−1} = µδ + o(δ), i ≥ 1
P_{ij} = o(δ), |i − j| > 1
 Discrete-time Markov chain, omitting o(δ); its limiting distribution coincides with that of the original process:
lim_{t→∞} P{N(t) = n} = lim_{k→∞} P{N_k = n}
[Transition diagram: birth-death chain with up-probability λδ, down-probability µδ, and self-loop probability 1 − λδ − µδ (1 − λδ at state 0).]
4&5-10
M/M/1 Queue: Discrete-Time Approach
[Same discrete-time transition diagram as on the previous slide.]

 Discrete-time birth-death process → DBE:
[µδ + o(δ)] π_n = [λδ + o(δ)] π_{n−1} ⇒ π_n = [(λδ + o(δ))/(µδ + o(δ))] π_{n−1} = … = [(λδ + o(δ))/(µδ + o(δ))]^n π_0
 Taking the limit δ → 0:
p_n = lim_{δ→0} π_n = lim_{δ→0} [(λδ + o(δ))/(µδ + o(δ))]^n π_0 = (λ/µ)^n p_0
Done!
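The discrete-time picture can be illustrated numerically: build the chain with the o(δ) terms dropped and the state space truncated at some large K (both assumptions made only for computability), iterate π ← πP, and compare with p_n = (1 − ρ)ρⁿ. The values λ = 2, µ = 5, δ = 0.01 are arbitrary:

```python
lam, mu, delta = 2.0, 5.0, 0.01   # illustrative rates; (lam+mu)*delta must be small
K = 40                            # truncate the state space for the computation
rho = lam / mu

# Start from the uniform distribution and iterate pi <- pi * P,
# where P is the tridiagonal transition matrix of the slide (o(delta) dropped).
pi = [1.0 / (K + 1)] * (K + 1)
for _ in range(20_000):
    new = [0.0] * (K + 1)
    new[0] = pi[0] * (1 - lam * delta) + pi[1] * mu * delta
    for j in range(1, K):
        new[j] = (pi[j - 1] * lam * delta
                  + pi[j] * (1 - lam * delta - mu * delta)
                  + pi[j + 1] * mu * delta)
    new[K] = pi[K - 1] * lam * delta + pi[K] * (1 - mu * delta)
    pi = new

# The iteration settles on the geometric distribution p_n = (1 - rho) rho^n.
for n in range(4):
    print(n, pi[n], (1 - rho) * rho ** n)
```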
4&5-11
Transition Probabilities?
 A_k: number of customers that arrive in I_k = (kδ, (k+1)δ]
 D_k: number of customers that depart in I_k = (kδ, (k+1)δ]
 Transition probabilities P_{ij} depend on the conditional probabilities
Q(a, d | n) = P{A_k = a, D_k = d | N_{k−1} = n}
 Calculate Q(a, d | n) using the arrival and departure statistics
 Use the Taylor expansions e^(−λδ) = 1 − λδ + o(δ) and e^(−µδ) = 1 − µδ + o(δ) to express Q as a function of δ
 Poisson arrivals: P{A_k ≥ 2} = o(δ)
 The probability of more than one arrival in I_k is o(δ)
 Show: the probability of more than one event (arrival or departure) in I_k is o(δ)
 See details in the textbook
4&5-12
Example: Slowing Down
Original system (rates λ, µ):
ρ = λ/µ, N = ρ/(1 − ρ), T = N/λ = 1/(µ − λ), W = ρ/(µ − λ)

Slowed-down system (rates λ/m, µ/m):
ρ = (λ/m)/(µ/m) = λ/µ (unchanged), N = ρ/(1 − ρ) (unchanged),
T′ = N/(λ/m) = m/(µ − λ) = m T, W′ = T′ − m/µ = m ρ/(µ − λ) = m W
 M/M/1 system: slow down the arrival and service rates by the
same factor m
 Utilization factors are the same ⇒stationary distributions the
same, average number in the system the same
 Delay in the slower system is m times higher

 Average number in queue is the same, but in the 1st system the
customers move out faster
4&5-13
Example: Statistical MUX-ing vs. TDM
[Diagram: m Poisson streams of rate λ/m merged into a single queue served at rate µ, versus m separate queues, each with arrival rate λ/m and service rate µ/m.]

T_TDM = m/(µ − λ) = m T
 m identical Poisson streams with rate λ/m; link with capacity 1;
packet lengths iid, exponential with mean 1/µ
 Alternative: split the link to m channels with capacity 1/m each,
and dedicate one channel to each traffic stream
 Delay in each “queue” becomes m times higher
Statistical multiplexing vs. TDM or FDM
When is TDM or FDM preferred over statistical multiplexing?
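The comparison is a one-line computation once the M/M/1 delay formula is in hand. A numeric sketch (m = 10 streams and rates λ = 5, µ = 10 are arbitrary illustrative values):

```python
m = 10                  # number of traffic streams (illustrative)
lam, mu = 5.0, 10.0     # total arrival rate and full-link service rate

# Statistical multiplexing: one shared M/M/1 queue with rates (lam, mu).
T_mux = 1 / (mu - lam)

# TDM/FDM: each stream gets a dedicated M/M/1 queue with rates (lam/m, mu/m).
T_tdm = 1 / (mu / m - lam / m)

print(T_mux, T_tdm)     # 0.2 s shared vs 2.0 s per dedicated channel
print(T_tdm / T_mux)    # the delay ratio equals m
```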
4&5-14
“PASTA” Theorem

 Markov chain: “stationary” or “in steady-state:”
 Process started at the stationary distribution, or
 Process runs for an infinite time t→∞
Probability that at any time t, process is in state i is
equal to the stationary probability
 Question: For an M/M/1 queue: given t is an arrival
time, what is the probability that N(t)=i?
Answer: Poisson Arrivals See Time Averages!
p_i = lim_{t→∞} P{N(t) = i} = lim_{t→∞} T_i(t)/t
where T_i(t) is the time spent in state i up to time t
4&5-15
PASTA Theorem
 Steady-state probabilities:
 Steady-state probabilities upon arrival:
 Lack of Anticipation Assumption (LAA): Future inter-arrival times
and service times of previously arrived customers are independent
Theorem: In a queueing system satisfying LAA:
1. If the arrival process is Poisson:
2. Poisson is the only process with this property
(necessary and sufficient condition)
p_n = lim_{t→∞} P{N(t) = n}
a_n = lim_{t→∞} P{N(t⁻) = n | arrival at t}
a_n = p_n, n = 0, 1, …
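PASTA can be seen in a small simulation of the M/M/1 chain: run it via competing exponentials, record the time-average of each state and the state each arrival finds, and compare. A sketch (λ = 3, µ = 5 and the horizon are arbitrary choices):

```python
import random
random.seed(1)

lam, mu = 3.0, 5.0        # illustrative rates
rho = lam / mu
T_END = 100_000.0         # simulation horizon

t, n = 0.0, 0             # current time, number in system
time_in = {}              # total time spent in each state
seen = {}                 # state found by each arriving customer
arrivals = 0

while t < T_END:
    rate = lam + (mu if n > 0 else 0.0)   # total active transition rate
    dt = random.expovariate(rate)
    time_in[n] = time_in.get(n, 0.0) + dt
    t += dt
    if random.random() < lam / rate:      # next event: arrival
        seen[n] = seen.get(n, 0) + 1      # state the arrival finds (PASTA)
        arrivals += 1
        n += 1
    else:                                 # next event: departure
        n -= 1

for k in range(3):
    p_k = time_in.get(k, 0.0) / t         # time-average of state k
    a_k = seen.get(k, 0) / arrivals       # fraction of arrivals seeing k
    print(k, round(p_k, 3), round(a_k, 3), round((1 - rho) * rho ** k, 3))
```

All three columns agree up to simulation noise: time-averages, arrival-averages, and the closed form (1 − ρ)ρᵏ.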
4&5-16
PASTA Theorem
Doesn’t PASTA apply for all arrival processes?
 Deterministic arrivals every 10 sec
 Deterministic service times 9 sec
 Upon arrival: system is always empty ⇒ a_1 = 0
 Average fraction of time with one customer in system: p_1 = 0.9
 “Customer” averages need not be time averages
 Randomization does not help, unless Poisson!
[Timeline: N(t) alternates between 1 (during the 9 s service) and 0 (1 s idle); arrivals at t = 0, 10, 20, 30, …, departures at t = 9, 19, 29, 39, …]
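This deterministic counterexample can be reproduced exactly with a few lines:

```python
INTER, SERVICE, N_ARR = 10.0, 9.0, 1000   # arrival every 10 s, 9 s service

busy_until = 0.0      # time at which the server next becomes idle
seen_busy = 0         # arrivals that find a customer in the system
busy_time = 0.0       # total time the system holds one customer

for k in range(N_ARR):
    t_arr = k * INTER
    if t_arr < busy_until:                # would find the server busy
        seen_busy += 1
    busy_until = max(t_arr, busy_until) + SERVICE
    busy_time += SERVICE

a1 = seen_busy / N_ARR            # fraction of arrivals seeing 1 customer
p1 = busy_time / (N_ARR * INTER)  # time-average fraction with 1 customer
print(a1, p1)                     # 0.0 0.9
```

Every arrival finds the system empty (a₁ = 0) even though the server is busy 90% of the time (p₁ = 0.9): customer averages need not be time averages.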
4&5-17
PASTA Theorem: Proof
 Define A(t,t+δ), the event that an arrival occurs in [t, t+ δ)
 Given that a customer arrives at t, probability of finding the
system in state n:
 A(t, t+δ) is independent of the state before time t, N(t⁻):
 N(t⁻) is determined by the arrival times < t and the corresponding service times
 A(t, t+δ) is independent of arrivals < t [Poisson]
 A(t, t+δ) is independent of service times of customers arrived < t [LAA]
a_n(t) = lim_{δ→0} P{N(t⁻) = n | A(t, t+δ)} = lim_{δ→0} P{N(t⁻) = n, A(t, t+δ)} / P{A(t, t+δ)}
= lim_{δ→0} P{N(t⁻) = n} P{A(t, t+δ)} / P{A(t, t+δ)} = P{N(t⁻) = n}

a_n = lim_{t→∞} a_n(t) = lim_{t→∞} P{N(t⁻) = n} = p_n
4&5-18
PASTA Theorem: Intuitive Proof

 t_a and t_r: randomly selected arrival and observation times, respectively
 The arrival processes prior to t_a and t_r, respectively, are stochastically identical
 The probability distributions of the time to the first arrival before t_a and t_r are both exponential with parameter λ
 Extending this to the 2nd, 3rd, etc. arrivals before t_a and t_r establishes the result
 State of the system at a given time t depends only on the arrivals (and associated service times) before t
Since the arrival processes before arrival times and random times are identical, so is the state of the system they see
4&5-19
Arrivals that Do not See Time-Averages
Example 1: Non-Poisson arrivals
 IID inter-arrival times, uniformly distributed between 2 and 4 sec
 Service times deterministic, 1 sec
Upon arrival: system is always empty
 λ = 1/3, T = 1 → N = λT = 1/3 → p_1 = 1/3
Example 2: LAA violated
 Poisson arrivals
 Service time of customer i: S_i = αT_{i+1}, α < 1, where T_{i+1} is the next inter-arrival time
Upon arrival: system is always empty (customer i departs before customer i+1 arrives)
Average fraction of time the system has 1 customer: p_1 = α
4&5-20
Distribution after Departure
 Steady-state probabilities after a departure:
d_n = lim_{t→∞} P{N(t⁺) = n | departure at t}
 Under very general assumptions:
 N(t) changes in unit increments
 if the limits a_n and d_n exist, then a_n = d_n, n = 0, 1, …
In steady-state, the system appears stochastically identical to an arriving and to a departing customer
 Poisson arrivals + LAA: an arriving and a departing customer see a system that is stochastically identical to the one seen by an observer looking at the system at an arbitrary time
4&5-21
M/M/* Queues
 Poisson arrival process
 Interarrival times: iid, exponential
 Service times: iid, exponential
 Service times and interarrival times: independent
 N(t): Number of customers in system at time t (state)
{N(t): t ≥ 0} can be modeled as a continuous-time
Markov chain
Transition rates depend on the characteristics of the
system
PASTA Theorem always holds
4&5-22
M/M/1/K Queue
 M/M/1 with finite waiting room
 At most K customers in the system

 Customer that upon arrival finds K customers in system is dropped
 Stationary distribution
 Stability condition: always stable – even if ρ≥1
 Probability of loss – using PASTA theorem:
[Transition diagram: states 0, 1, …, K−1, K; arrival rate λ and departure rate µ between neighboring states; no transitions beyond state K.]

p_n = ρ^n p_0, n = 1, 2, …, K, with p_0 = (1 − ρ)/(1 − ρ^(K+1))

P{loss} = P{N(t) = K} = (1 − ρ) ρ^K / (1 − ρ^(K+1))
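A small helper makes the loss formula concrete, including the ρ = 1 case where the stationary distribution is uniform over the K + 1 states (the function name is just for illustration):

```python
def mm1k_loss(rho, K):
    """Blocking probability of the M/M/1/K queue; by PASTA, P{loss} = p_K."""
    if rho == 1.0:
        return 1.0 / (K + 1)      # all K+1 states equally likely
    return (1 - rho) * rho ** K / (1 - rho ** (K + 1))

print(mm1k_loss(0.8, 10))   # modest loss below saturation
print(mm1k_loss(1.0, 10))   # 1/11: finite even at rho = 1
print(mm1k_loss(2.0, 10))   # overload: loss tends to 1 - 1/rho
```

Note the system is stable for any ρ: a full buffer simply drops arrivals instead of letting the queue grow.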
4&5-23
M/M/1/K Queue (proof)
[Same transition diagram, truncated at state K.]

 Exactly as in the M/M/1 queue: p_n = ρ^n p_0, n = 1, 2, …, K
 Normalization constant:
Σ_{n=0}^K p_n = 1 ⇒ p_0 Σ_{n=0}^K ρ^n = 1 ⇒ p_0 = 1/(1 + ρ + … + ρ^K) = (1 − ρ)/(1 − ρ^(K+1))
Generalize: Truncating a Markov chain
4&5-24
Truncating a Markov Chain
 {X(t): t ≥ 0}: continuous-time Markov chain with stationary distribution {p_j: j = 0, 1, …}
 S: a subset of {0, 1, …}; observe the process only while it is in S
 Eliminate all states not in S
 Set q_{ji} = q_{ij} = 0, for all j ∈ S, i ∉ S
 {Y(t): t ≥ 0}: resulting truncated process; if irreducible:
 Continuous-time Markov chain
 Stationary distribution:
p̄_j = p_j / Σ_{i∈S} p_i, if j ∈ S, and p̄_j = 0, if j ∉ S
Under certain conditions – need to verify, depending on the system
4&5-25
Truncating a Markov Chain (cont.)

 Possible sufficient condition: verify that the distribution of the truncated process
1. Satisfies the GBE:
p_j Σ_{i∈S} q_{ji} = Σ_{i∈S} p_i q_{ij} ⇒ (p_j/p(S)) Σ_{i∈S} q_{ji} = Σ_{i∈S} (p_i/p(S)) q_{ij} ⇒ p̄_j Σ_{i∈S} q_{ji} = Σ_{i∈S} p̄_i q_{ij}, j ∈ S
which holds provided p_j Σ_{i∉S} q_{ji} = Σ_{i∉S} p_i q_{ij}, j ∈ S
2. Satisfies the probability conservation law:
Σ_{j∈S} p̄_j = Σ_{j∈S} p_j / p(S) = 1, where p(S) ≡ Σ_{i∈S} p_i
Another – even better – sufficient condition: DBE!
Relates to “reversibility”
Holds for multidimensional chains
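For the M/M/1 queue truncated to S = {0, …, K}, this recipe reproduces the M/M/1/K distribution: the renormalized geometric distribution equals the formula derived earlier. A quick check (ρ = 0.7, K = 5 are arbitrary illustrative values):

```python
rho, K = 0.7, 5                      # illustrative utilization and buffer size

# Full M/M/1 stationary distribution on the retained states.
p = [(1 - rho) * rho ** n for n in range(K + 1)]

# Truncate to S = {0, ..., K}: renormalize p over S.
pS = sum(p)                          # equals 1 - rho**(K+1)
p_trunc = [pn / pS for pn in p]

# Direct M/M/1/K formula from the earlier slide.
p_mm1k = [(1 - rho) * rho ** n / (1 - rho ** (K + 1)) for n in range(K + 1)]

match = all(abs(a - b) < 1e-12 for a, b in zip(p_trunc, p_mm1k))
print(match)                         # True
```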