TCOM 501: Networking Theory & Fundamentals, Lecture 2, January 22, 2003. Prof. Yannis A. Korilis.


Page 1:

TCOM 501: Networking Theory & Fundamentals
Lecture 2, January 22, 2003
Prof. Yannis A. Korilis

Page 2:

2-2 Topics

Delay in Packet Networks
Introduction to Queueing Theory
Review of Probability Theory
The Poisson Process
Little's Theorem: Proof and Intuitive Explanation; Applications

Page 3:

2-3 Sources of Network Delay

Processing Delay: assume processing power is not a constraint
Queueing Delay: time buffered waiting for transmission
Transmission Delay
Propagation Delay: time the signal spends traveling on the link; independent of the traffic carried by the link

Focus: Queueing & Transmission Delay

Page 4:

2-4 Basic Queueing Model

A queue models any service station with:
  One or multiple servers
  A waiting area or buffer
Customers arrive to receive service
A customer that upon arrival does not find a free server waits in the buffer

[Figure: arrivals enter the buffer (queued customers), move to the server(s) (in service), and then depart]

Page 5:

2-5 Characteristics of a Queue

Number of servers m: one, multiple, infinite
Buffer size b
Service discipline (scheduling): FCFS, LCFS, Processor Sharing (PS), etc.
Arrival process
Service statistics

Page 6:

2-6 Arrival Process

τ_n: interarrival time between customers n and n+1
τ_n is a random variable; {τ_n, n ≥ 1} is a stochastic process
Interarrival times are identically distributed and have a common mean: E[τ_n] = E[τ] = 1/λ
λ is called the arrival rate

[Figure: time axis with arrivals n-1, n, n+1 at times t_{n-1}, t_n, t_{n+1}, and interarrival time τ_n between consecutive arrivals]

Page 7:

2-7 Service-Time Process

s_n: service time of customer n at the server
{s_n, n ≥ 1} is a stochastic process
Service times are identically distributed with common mean: E[s_n] = E[s] = 1/μ
μ is called the service rate
For packets, are the service times really random?

[Figure: time axis showing the service times s_{n-1}, s_n, s_{n+1}]

Page 8:

2-8 Queue Descriptors

Generic descriptor: A/S/m/k
A denotes the arrival process
  For Poisson arrivals we use M (for Markovian)
S denotes the service-time distribution
  M: exponential distribution
  D: deterministic service times
  G: general distribution
m is the number of servers
k is the maximum number of customers allowed in the system, either in the buffer or in service
  k is omitted when the buffer size is infinite

Page 9:

2-9 Queue Descriptors: Examples

M/M/1: Poisson arrivals, exponentially distributed service times, one server, infinite buffer
M/M/m: same as above, with m servers
M/M/m/m: Poisson arrivals, exponentially distributed service times, m servers, no buffering
M/G/1: Poisson arrivals, identically distributed service times that follow a general distribution, one server, infinite buffer
*/D/∞: a constant-delay system
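As a rough illustration of these descriptors (not part of the original slides), the Python sketch below simulates FCFS waiting times in an M/M/1 queue using the standard Lindley recursion W_{n+1} = max(0, W_n + s_n - τ_{n+1}). The helper name simulate_mm1_waiting_times, the rates lam and mu, and the sample size are arbitrary choices, and the closing comparison quotes the textbook M/M/1 mean waiting time ρ/(μ-λ), which is not derived in this lecture. Assumes numpy is available.

```python
import numpy as np

def simulate_mm1_waiting_times(lam=0.8, mu=1.0, n=200_000, seed=0):
    """Rough M/M/1 sketch: FCFS waiting times via the Lindley recursion.

    lam: arrival rate (Poisson arrivals -> exponential interarrival times)
    mu:  service rate (exponential service times)
    """
    rng = np.random.default_rng(seed)
    interarrivals = rng.exponential(1.0 / lam, size=n)   # tau_n
    services = rng.exponential(1.0 / mu, size=n)         # s_n

    w = np.empty(n)
    w[0] = 0.0
    for i in range(1, n):
        # Waiting time of customer i: leftover work found upon arrival, if any
        w[i] = max(0.0, w[i - 1] + services[i - 1] - interarrivals[i])
    return w

if __name__ == "__main__":
    lam, mu = 0.8, 1.0
    w = simulate_mm1_waiting_times(lam, mu)
    rho = lam / mu
    print("simulated mean wait      :", w.mean())
    print("textbook rho/(mu - lam)  :", rho / (mu - lam))
```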

Page 10:

2-10 Probability Fundamentals

Exponential Distribution
Memoryless Property
Poisson Distribution
Poisson Process
  Definition and Properties
  Interarrival Time Distribution
Modeling Arrival and Service Statistics

Page 11:

2-11 The Exponential Distribution

A continuous RV X follows the exponential distribution with parameter λ, if its probability density function is:

$$f_X(x) = \begin{cases} \lambda e^{-\lambda x} & \text{if } x \ge 0 \\ 0 & \text{if } x < 0 \end{cases}$$

Probability distribution function:

$$F_X(x) = P\{X \le x\} = \begin{cases} 1 - e^{-\lambda x} & \text{if } x \ge 0 \\ 0 & \text{if } x < 0 \end{cases}$$

Page 12:

2-12 Exponential Distribution (cont.)

Mean and Variance:

$$E[X] = \frac{1}{\lambda}, \qquad \mathrm{Var}(X) = \frac{1}{\lambda^2}$$

Proof:

$$E[X] = \int_0^\infty x f_X(x)\,dx = \int_0^\infty \lambda x e^{-\lambda x}\,dx
= \left[-x e^{-\lambda x}\right]_0^\infty + \int_0^\infty e^{-\lambda x}\,dx = \frac{1}{\lambda}$$

$$E[X^2] = \int_0^\infty \lambda x^2 e^{-\lambda x}\,dx
= \left[-x^2 e^{-\lambda x}\right]_0^\infty + \int_0^\infty 2x e^{-\lambda x}\,dx = \frac{2}{\lambda}\,E[X] = \frac{2}{\lambda^2}$$

$$\mathrm{Var}(X) = E[X^2] - (E[X])^2 = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}$$
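A quick numerical sanity check of the formulas above (an illustrative sketch, not from the slides; the rate lam and the sample size are arbitrary, and numpy is assumed to be available):

```python
import numpy as np

lam = 2.5                                 # arbitrary rate parameter
rng = np.random.default_rng(1)
x = rng.exponential(scale=1.0 / lam, size=1_000_000)

print("sample mean     :", x.mean(), "  theory 1/lam   :", 1 / lam)
print("sample variance :", x.var(),  "  theory 1/lam^2 :", 1 / lam**2)
```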

Page 13:

2-13 Memoryless Property

Past history has no influence on the future:

$$P\{X > x + t \mid X > t\} = P\{X > x\}$$

Proof:

$$P\{X > x + t \mid X > t\} = \frac{P\{X > x + t,\ X > t\}}{P\{X > t\}} = \frac{P\{X > x + t\}}{P\{X > t\}}
= \frac{e^{-\lambda(x+t)}}{e^{-\lambda t}} = e^{-\lambda x} = P\{X > x\}$$

The exponential is the only continuous distribution with the memoryless property.
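The memoryless property can also be checked empirically. The sketch below (not from the slides) estimates P{X > x+t | X > t} and compares it with P{X > x}; the values of lam, x0, and t0 are arbitrary, and numpy is assumed to be available.

```python
import numpy as np

lam, x0, t0 = 1.5, 0.7, 2.0               # arbitrary rate and offsets
rng = np.random.default_rng(2)
samples = rng.exponential(1.0 / lam, size=2_000_000)

survived_t = samples[samples > t0]                  # condition on X > t
p_cond = np.mean(survived_t > t0 + x0)              # P{X > x+t | X > t}
p_uncond = np.mean(samples > x0)                    # P{X > x}

print("P(X > x+t | X > t) ~", p_cond)
print("P(X > x)           ~", p_uncond)
print("exact e^{-lam*x}    ", np.exp(-lam * x0))
```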

Page 14:

2-14 Poisson Distribution

A discrete RV X follows the Poisson distribution with parameter λ if its probability mass function is:

$$P\{X = k\} = e^{-\lambda} \frac{\lambda^k}{k!}, \qquad k = 0, 1, 2, \ldots$$

Wide applicability in modeling the number of random events that occur during a given time interval (the Poisson Process):
  Customers that arrive at a post office during a day
  Wrong phone calls received during a week
  Students that go to the instructor's office during office hours
  ... and packets that arrive at a network switch

Page 15:

2-15 Poisson Distribution (cont.)

Mean and Variance:

$$E[X] = \lambda, \qquad \mathrm{Var}(X) = \lambda$$

Proof:

$$E[X] = \sum_{k=0}^{\infty} k\,P\{X = k\} = e^{-\lambda} \sum_{k=1}^{\infty} k \frac{\lambda^k}{k!}
= \lambda e^{-\lambda} \sum_{k=1}^{\infty} \frac{\lambda^{k-1}}{(k-1)!} = \lambda e^{-\lambda} \sum_{j=0}^{\infty} \frac{\lambda^j}{j!} = \lambda$$

$$E[X^2] = \sum_{k=0}^{\infty} k^2 P\{X = k\} = e^{-\lambda} \sum_{k=1}^{\infty} k^2 \frac{\lambda^k}{k!}
= \lambda e^{-\lambda} \sum_{k=1}^{\infty} k \frac{\lambda^{k-1}}{(k-1)!}
= \lambda e^{-\lambda} \sum_{j=0}^{\infty} (j+1) \frac{\lambda^j}{j!}
= \lambda e^{-\lambda} \left( \sum_{j=0}^{\infty} j \frac{\lambda^j}{j!} + \sum_{j=0}^{\infty} \frac{\lambda^j}{j!} \right) = \lambda(\lambda + 1) = \lambda^2 + \lambda$$

$$\mathrm{Var}(X) = E[X^2] - (E[X])^2 = \lambda^2 + \lambda - \lambda^2 = \lambda$$
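As with the exponential case, a tiny simulation (illustrative only, assuming numpy; λ is arbitrary) confirms that the sample mean and sample variance of Poisson data both approach λ:

```python
import numpy as np

lam = 4.0
rng = np.random.default_rng(3)
k = rng.poisson(lam, size=1_000_000)
print("sample mean    :", k.mean())      # ~ lam
print("sample variance:", k.var())       # ~ lam
```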

Page 16:

2-16 Sum of Poisson Random Variables

X_i, i = 1, 2, ..., n, are independent RVs
X_i follows the Poisson distribution with parameter λ_i
Partial sum defined as: S_n = X_1 + X_2 + ... + X_n
S_n follows the Poisson distribution with parameter λ = λ_1 + λ_2 + ... + λ_n

Page 17:

2-17 Sum of Poisson Random Variables (cont.)

Proof: for n = 2; generalization by induction. The pmf of S = X_1 + X_2 is

$$P\{S = m\} = \sum_{k=0}^{m} P\{X_1 = k,\ X_2 = m-k\} = \sum_{k=0}^{m} P\{X_1 = k\}\,P\{X_2 = m-k\}$$

$$= \sum_{k=0}^{m} e^{-\lambda_1} \frac{\lambda_1^k}{k!} \cdot e^{-\lambda_2} \frac{\lambda_2^{m-k}}{(m-k)!}
= e^{-(\lambda_1 + \lambda_2)} \frac{1}{m!} \sum_{k=0}^{m} \frac{m!}{k!\,(m-k)!}\, \lambda_1^k \lambda_2^{m-k}
= e^{-(\lambda_1 + \lambda_2)} \frac{(\lambda_1 + \lambda_2)^m}{m!}$$

Poisson with parameter λ = λ_1 + λ_2.
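The result can be illustrated numerically. The sketch below (not from the slides; parameters are arbitrary, numpy assumed available) compares the empirical pmf of X_1 + X_2 with the Poisson pmf of parameter λ_1 + λ_2.

```python
import numpy as np
from math import exp, factorial

lam1, lam2, n = 1.3, 2.2, 1_000_000        # arbitrary parameters
rng = np.random.default_rng(4)
s = rng.poisson(lam1, n) + rng.poisson(lam2, n)   # sum of independent Poissons

lam = lam1 + lam2
for m in range(6):
    empirical = np.mean(s == m)
    theory = exp(-lam) * lam**m / factorial(m)
    print(f"P(S={m}): empirical {empirical:.4f}   Poisson({lam}) {theory:.4f}")
```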

Page 18:

2-18 Sampling a Poisson Variable

X follows the Poisson distribution with parameter λ
Each of the X arrivals is of type i with probability p_i, i = 1, 2, ..., n, independently of other arrivals; p_1 + p_2 + ... + p_n = 1
X_i denotes the number of type-i arrivals
X_1, X_2, ..., X_n are independent
X_i follows the Poisson distribution with parameter λp_i

Page 19:

2-19 Sampling a Poisson Variable (cont.)

Proof: for n = 2; generalize by induction. Joint pmf:

$$P\{X_1 = k_1,\ X_2 = k_2\} = P\{X_1 = k_1,\ X_2 = k_2 \mid X = k_1 + k_2\}\, P\{X = k_1 + k_2\}$$

$$= \binom{k_1 + k_2}{k_1} p_1^{k_1} p_2^{k_2} \cdot e^{-\lambda} \frac{\lambda^{k_1 + k_2}}{(k_1 + k_2)!}
= \frac{1}{k_1!\,k_2!} (\lambda p_1)^{k_1} (\lambda p_2)^{k_2} \cdot e^{-\lambda(p_1 + p_2)}
= e^{-\lambda p_1} \frac{(\lambda p_1)^{k_1}}{k_1!} \cdot e^{-\lambda p_2} \frac{(\lambda p_2)^{k_2}}{k_2!}$$

X_1 and X_2 are independent, with

$$P\{X_1 = k_1\} = e^{-\lambda p_1} \frac{(\lambda p_1)^{k_1}}{k_1!}, \qquad P\{X_2 = k_2\} = e^{-\lambda p_2} \frac{(\lambda p_2)^{k_2}}{k_2!}$$

X_i follows the Poisson distribution with parameter λp_i.
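A small simulation of the splitting result (an illustrative sketch, assuming numpy; λ, p, and the number of experiments are arbitrary): each of X ~ Poisson(λ) arrivals is labeled type 1 with probability p, and the resulting counts behave like independent Poisson(λp) and Poisson(λ(1-p)) variables.

```python
import numpy as np

lam, p, n = 5.0, 0.3, 500_000             # arbitrary parameters
rng = np.random.default_rng(5)

x = rng.poisson(lam, n)                   # total arrivals in each experiment
x1 = rng.binomial(x, p)                   # type-1 arrivals: each kept w.p. p
x2 = x - x1                               # type-2 arrivals

print("X1 mean/var:", x1.mean(), x1.var(), "  expected", lam * p)
print("X2 mean/var:", x2.mean(), x2.var(), "  expected", lam * (1 - p))
print("corr(X1, X2):", np.corrcoef(x1, x2)[0, 1])   # ~ 0, consistent with independence
```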

Page 20:

2-20 Poisson Approximation to Binomial

Binomial distribution with parameters (n, p):

$$P\{X = k\} = \binom{n}{k} p^k (1-p)^{n-k}$$

As n → ∞ and p → 0, with np = λ moderate, the binomial distribution converges to the Poisson distribution with parameter λ.

Proof:

$$P\{X = k\} = \binom{n}{k} p^k (1-p)^{n-k}
= \frac{n(n-1)\cdots(n-k+1)}{k!} \left(\frac{\lambda}{n}\right)^k \left(1 - \frac{\lambda}{n}\right)^{n-k}
= \frac{n(n-1)\cdots(n-k+1)}{n^k} \cdot \frac{\lambda^k}{k!} \cdot \frac{(1 - \lambda/n)^n}{(1 - \lambda/n)^k}$$

As n → ∞:

$$\frac{n(n-1)\cdots(n-k+1)}{n^k} \to 1, \qquad \left(1 - \frac{\lambda}{n}\right)^k \to 1, \qquad \left(1 - \frac{\lambda}{n}\right)^n \to e^{-\lambda}$$

$$\Rightarrow\quad P\{X = k\} \to e^{-\lambda} \frac{\lambda^k}{k!}$$
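The convergence is easy to see numerically. This sketch (parameters chosen arbitrarily, standard library only) compares the Binomial(n, λ/n) pmf with the Poisson(λ) pmf for a few values of k.

```python
from math import comb, exp, factorial

lam, n = 3.0, 1000                        # np = lam held fixed, n large
p = lam / n

for k in range(8):
    binom = comb(n, k) * p**k * (1 - p)**(n - k)
    poisson = exp(-lam) * lam**k / factorial(k)
    print(f"k={k}: binomial {binom:.5f}   poisson {poisson:.5f}")
```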

Page 21:

2-21 Poisson Process with Rate λ

{A(t): t ≥ 0} counting process
A(t) is the number of events (arrivals) that have occurred from time 0 (when A(0) = 0) to time t
A(t) - A(s): number of arrivals in the interval (s, t]
Numbers of arrivals in disjoint intervals are independent
The number of arrivals in any interval (t, t+τ] of length τ:
  depends only on its length τ
  follows the Poisson distribution with parameter λτ:

$$P\{A(t+\tau) - A(t) = n\} = e^{-\lambda\tau} \frac{(\lambda\tau)^n}{n!}, \qquad n = 0, 1, \ldots$$

Average number of arrivals in the interval is λτ; λ is the arrival rate.

Page 22:

2-22 Interarrival-Time Statistics

Interarrival times of a Poisson process are independent and follow the exponential distribution with parameter λ
t_n: time of the nth arrival; τ_n = t_{n+1} - t_n: nth interarrival time

$$P\{\tau_n \le s\} = 1 - e^{-\lambda s}, \qquad s \ge 0$$

Proof: probability distribution function

$$P\{\tau_n \le s\} = 1 - P\{\tau_n > s\} = 1 - P\{A(t_n + s) - A(t_n) = 0\} = 1 - e^{-\lambda s}$$

Independence follows from the independence of the numbers of arrivals in disjoint intervals.
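Conversely, the interarrival characterization gives a standard way to generate a Poisson process: accumulate iid exponential gaps. The sketch below (illustrative only, numpy assumed; the rate, window length, and horizon are arbitrary) builds arrival times this way and checks that the count in a window of length τ has mean and variance close to λτ.

```python
import numpy as np

lam, tau, horizon = 2.0, 3.0, 48_000.0     # arbitrary rate, window, horizon
rng = np.random.default_rng(6)

# Arrival times: cumulative sums of exponential(1/lam) interarrival times
gaps = rng.exponential(1.0 / lam, size=int(1.5 * lam * horizon))
arrivals = np.cumsum(gaps)
arrivals = arrivals[arrivals < horizon]

# Count arrivals in disjoint windows (0, tau], (tau, 2*tau], ...
edges = np.arange(0.0, horizon + tau, tau)
counts, _ = np.histogram(arrivals, bins=edges)

print("mean count per window:", counts.mean(), "  theory lam*tau:", lam * tau)
print("var  count per window:", counts.var(),  "  theory lam*tau:", lam * tau)
```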

Page 23:

2-23 Small Interval Probabilities

Interval (t, t+δ] of length δ:

$$P\{A(t+\delta) - A(t) = 0\} = 1 - \lambda\delta + o(\delta)$$
$$P\{A(t+\delta) - A(t) = 1\} = \lambda\delta + o(\delta)$$
$$P\{A(t+\delta) - A(t) \ge 2\} = o(\delta)$$

Proof:

$$P\{A(t+\delta) - A(t) = 0\} = e^{-\lambda\delta} = 1 - \lambda\delta + \frac{(\lambda\delta)^2}{2} - \cdots = 1 - \lambda\delta + o(\delta)$$

$$P\{A(t+\delta) - A(t) = 1\} = \lambda\delta\, e^{-\lambda\delta} = \lambda\delta\left(1 - \lambda\delta + \frac{(\lambda\delta)^2}{2} - \cdots\right) = \lambda\delta + o(\delta)$$

$$P\{A(t+\delta) - A(t) \ge 2\} = 1 - \sum_{k=0}^{1} P\{A(t+\delta) - A(t) = k\}
= 1 - (1 - \lambda\delta + o(\delta)) - (\lambda\delta + o(\delta)) = o(\delta)$$

Page 24:

2-24 Merging & Splitting Poisson Processes

A_1, ..., A_k independent Poisson processes with rates λ_1, ..., λ_k
Merged into a single process A = A_1 + ... + A_k
A is a Poisson process with rate λ = λ_1 + ... + λ_k

A: Poisson process with rate λ
Split into processes A_1 and A_2 independently, with probabilities p and 1-p respectively
A_1 is Poisson with rate λ_1 = λp
A_2 is Poisson with rate λ_2 = λ(1-p)

[Figure: two diagrams, one merging streams of rates λ_1, ..., λ_k into a single stream of rate λ_1+...+λ_k, and one splitting a rate-λ stream with probabilities p and 1-p into streams of rates λp and λ(1-p)]
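A quick check of the merging property (an illustrative sketch, assuming numpy; rates and stream sizes are arbitrary): superposing two independently generated Poisson streams gives interarrival times whose mean is close to 1/(λ_1 + λ_2).

```python
import numpy as np

lam1, lam2, n = 1.0, 3.0, 200_000          # arbitrary rates and stream sizes
rng = np.random.default_rng(7)

# Build each stream from its exponential interarrival times, then merge
t1 = np.cumsum(rng.exponential(1.0 / lam1, n))
t2 = np.cumsum(rng.exponential(1.0 / lam2, n))
merged = np.sort(np.concatenate([t1, t2]))

# Drop the tail where one stream has already ended, so the superposition is valid
cutoff = min(t1[-1], t2[-1])
merged = merged[merged <= cutoff]

gaps = np.diff(merged)
print("mean merged interarrival:", gaps.mean())
print("theory 1/(lam1 + lam2)  :", 1.0 / (lam1 + lam2))
```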

Page 25:

2-25 Modeling Arrival Statistics

The Poisson process is widely used to model packet arrivals in numerous networking problems
Justification: provides a good model for the aggregate traffic of a large number of "independent" users
  n traffic streams, with independent identically distributed (iid) interarrival times with PDF F(s), not necessarily exponential
  Arrival rate of each stream is λ/n
  As n → ∞, the combined stream can be approximated by a Poisson process under mild conditions on F(s), e.g., F(0) = 0, F'(0) > 0
Most important reason for the Poisson assumption: analytic tractability of queueing models

Page 26:

2-26 Little's Theorem

λ: customer arrival rate
N: average number of customers in the system
T: average delay per customer in the system

Little's Theorem: for a system in steady state,

$$N = \lambda T$$

[Figure: the system viewed as a black box: arrivals at rate λ, N customers inside on average, average time T spent in the system]
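Little's theorem can be observed directly in a simulation. The sketch below (not from the slides; the helper name mm1_little_check, the rates, and the sample size are arbitrary, and numpy is assumed) runs an FCFS M/M/1 queue, measures the time-average number in the system N, the arrival rate λ, and the average time in system T, and checks that N ≈ λT. The last printed line quotes the standard M/M/1 value ρ/(1-ρ) for reference; that formula is not derived in this lecture.

```python
import numpy as np

def mm1_little_check(lam=0.7, mu=1.0, n=200_000, seed=8):
    """Rough FCFS M/M/1 sketch used to check N ~ lambda * T (Little's theorem)."""
    rng = np.random.default_rng(seed)
    arrivals = np.cumsum(rng.exponential(1.0 / lam, n))   # arrival times t_i
    services = rng.exponential(1.0 / mu, n)               # service times s_i

    # FCFS departures: service starts at max(own arrival, previous departure)
    departures = np.empty(n)
    prev = 0.0
    for i in range(n):
        prev = max(arrivals[i], prev) + services[i]
        departures[i] = prev

    horizon = arrivals[-1]                  # observe the system on [0, horizon]
    T = np.mean(departures - arrivals)      # average time in system per customer
    lam_hat = n / horizon                   # measured arrival rate

    # Time-average number in system: integrate the +1/-1 step function N(t)
    times = np.concatenate([arrivals, departures])
    steps = np.concatenate([np.ones(n), -np.ones(n)])
    order = np.argsort(times)
    times, steps = times[order], steps[order]
    keep = times <= horizon
    times, steps = times[keep], steps[keep]
    n_in_system = np.cumsum(steps)
    durations = np.diff(np.append(times, horizon))
    N = np.sum(n_in_system * durations) / horizon

    print("time-average N :", N)
    print("lambda * T     :", lam_hat * T)
    print("M/M/1 reference:", (lam / mu) / (1 - lam / mu))

mm1_little_check()
```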

Page 27:

2-27 Counting Processes of a Queue

N(t): number of customers in the system at time t
α(t): number of customer arrivals up to time t
β(t): number of customer departures up to time t
T_i: time spent in the system by the ith customer

[Figure: α(t) and β(t) plotted as staircase functions of t, with N(t) the vertical gap between them]

Page 28:

2-28 Time Averages

Time averages over the interval [0, t]:

$$N_t = \frac{1}{t} \int_0^t N(s)\,ds, \qquad \lambda_t = \frac{\alpha(t)}{t}, \qquad T_t = \frac{1}{\alpha(t)} \sum_{i=1}^{\alpha(t)} T_i$$

Steady-state time averages:

$$N = \lim_{t \to \infty} N_t, \qquad \lambda = \lim_{t \to \infty} \lambda_t, \qquad T = \lim_{t \to \infty} T_t$$

Little's theorem: N = λT
Applies to any queueing system provided that the limits T and λ exist and the time-average departure rate also converges to λ
We give a simple graphical proof under a set of more restrictive assumptions

Page 29:

2-29 Proof of Little's Theorem for FCFS

FCFS system, N(0) = 0
α(t) and β(t): staircase graphs; N(t) = α(t) - β(t)
Shaded area between the graphs: $S(t) = \int_0^t N(s)\,ds$

Assumption: N(t) = 0 infinitely often. For any such t:

$$\frac{1}{t}\int_0^t N(s)\,ds = \frac{1}{t}\sum_{i=1}^{\alpha(t)} T_i
= \frac{\alpha(t)}{t}\cdot\frac{1}{\alpha(t)}\sum_{i=1}^{\alpha(t)} T_i
\quad\Longrightarrow\quad N_t = \lambda_t T_t$$

If the limits N_t → N, T_t → T, λ_t → λ exist, Little's formula follows
We will relax the last assumption

[Figure: staircase plots of α(t) and β(t) versus t; the times T_1, T_2, ..., T_i appear as horizontal strips between the two curves and N(t) as their vertical difference]

Page 30:

2-30 Proof of Little's for FCFS (cont.)

In general, even if the queue is not empty infinitely often:

$$\sum_{i=1}^{\beta(t)} T_i \;\le\; \int_0^t N(s)\,ds \;\le\; \sum_{i=1}^{\alpha(t)} T_i
\quad\Longrightarrow\quad
\frac{\beta(t)}{t}\cdot\frac{1}{\beta(t)}\sum_{i=1}^{\beta(t)} T_i \;\le\; \frac{1}{t}\int_0^t N(s)\,ds \;\le\; \frac{\alpha(t)}{t}\cdot\frac{1}{\alpha(t)}\sum_{i=1}^{\alpha(t)} T_i$$

The result follows assuming that the limits T_t → T and λ_t → λ exist and that the departure rate β(t)/t converges to the same limit λ

[Figure: same staircase plots of α(t) and β(t), with the times T_1, T_2, ..., T_i between the two curves]

Page 31:

2-31 Probabilistic Form of Little's Theorem

Have considered a single sample function of a stochastic process
Now we focus on the probabilities of the various sample functions of a stochastic process
Probability of n customers in the system at time t:

$$p_n(t) = P\{N(t) = n\}$$

Expected number of customers in the system at time t:

$$E[N(t)] = \sum_{n=0}^{\infty} n\,P\{N(t) = n\} = \sum_{n=0}^{\infty} n\,p_n(t)$$

Page 32:

2-32 Probabilistic Form of Little (cont.)

p_n(t) and E[N(t)] depend on t and on the initial distribution at t = 0
We will consider systems that converge to steady state: there exist p_n, independent of the initial distribution, with

$$\lim_{t \to \infty} p_n(t) = p_n, \qquad n = 0, 1, \ldots$$

Expected number of customers in steady state [stochastic average]:

$$EN = \sum_{n=0}^{\infty} n\,p_n = \lim_{t \to \infty} E[N(t)]$$

For an ergodic process, the time average of a sample function equals the steady-state expectation, with probability 1:

$$N = \lim_{t \to \infty} N_t = \lim_{t \to \infty} E[N(t)] = EN$$

Page 33:

2-33 Probabilistic Form of Little (cont.)

In principle, we can find the probability distribution of the delay T_i for customer i, and from that the expected value E[T_i], which converges to a steady-state value

$$ET = \lim_{i \to \infty} E[T_i]$$

For an ergodic system:

$$T = \lim_{i \to \infty} \frac{1}{i}\sum_{j=1}^{i} T_j = \lim_{i \to \infty} E[T_i] = ET$$

Probabilistic form of Little's formula:

$$EN = \lambda\,ET$$

with the arrival rate defined as

$$\lambda = \lim_{t \to \infty} \frac{E[\alpha(t)]}{t}$$

Page 34:

2-34 Time vs. Stochastic Averages

"Time averages = stochastic averages" for all systems of interest in this course
This holds if a single sample function of the stochastic process contains all possible realizations of the process as t → ∞
Can be justified on the basis of general properties of Markov chains

Page 35:

2-35 Moment Generating Function

1. Definition: for any t ∈ ℝ:

$$M_X(t) = E[e^{tX}] = \begin{cases} \displaystyle\int_{-\infty}^{\infty} e^{tx} f_X(x)\,dx, & X \text{ continuous} \\[2ex] \displaystyle\sum_{j} e^{t x_j}\,P\{X = x_j\}, & X \text{ discrete} \end{cases}$$

2. If the moment generating function M_X(t) of X exists and is finite in some neighborhood of t = 0, it determines the distribution of X uniquely.

3. Fundamental properties: for any n ∈ ℕ:

$$\text{(i)}\ \ \frac{d^n}{dt^n} M_X(t) = E[X^n e^{tX}] \qquad\qquad \text{(ii)}\ \ \frac{d^n}{dt^n} M_X(0) = E[X^n]$$

4. Moment generating functions and independence:

$$X, Y \text{ independent} \;\Rightarrow\; M_{X+Y}(t) = M_X(t)\,M_Y(t)$$

The converse is not true.
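As an illustration of property (ii) (a sketch, assuming the sympy package is available), the snippet below takes the MGF of the exponential(λ) distribution from the table on slide 2-37 and recovers E[X] = 1/λ and Var(X) = 1/λ² by differentiating at t = 0, matching the values derived on slide 2-12.

```python
import sympy as sp

t, lam = sp.symbols('t lam', positive=True)

# MGF of the exponential(lam) distribution (see the table on slide 2-37)
M = lam / (lam - t)

# Property (ii): the n-th derivative at t = 0 gives the n-th moment E[X^n]
EX = sp.diff(M, t, 1).subs(t, 0)          # 1/lam
EX2 = sp.diff(M, t, 2).subs(t, 0)         # 2/lam**2

print("E[X]   =", sp.simplify(EX))                # 1/lam
print("E[X^2] =", sp.simplify(EX2))               # 2/lam**2
print("Var(X) =", sp.simplify(EX2 - EX**2))       # 1/lam**2
```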

Page 36:

2-36 Discrete Random Variables

Distribution (parameters) | pmf P{X = k} | MGF M_X(t) | Mean E[X] | Variance Var(X)
Binomial (n, p) | $\binom{n}{k} p^k (1-p)^{n-k}$, k = 0, 1, ..., n | $(p e^t + 1 - p)^n$ | $np$ | $np(1-p)$
Geometric (p) | $(1-p)^{k-1} p$, k = 1, 2, ... | $\dfrac{p e^t}{1 - (1-p)e^t}$ | $\dfrac{1}{p}$ | $\dfrac{1-p}{p^2}$
Negative Binomial (r, p) | $\binom{k-1}{r-1} p^r (1-p)^{k-r}$, k = r, r+1, ... | $\left[\dfrac{p e^t}{1 - (1-p)e^t}\right]^r$ | $\dfrac{r}{p}$ | $\dfrac{r(1-p)}{p^2}$
Poisson (λ) | $e^{-\lambda} \dfrac{\lambda^k}{k!}$, k = 0, 1, ... | $e^{\lambda(e^t - 1)}$ | $\lambda$ | $\lambda$

Page 37:

2-37 Continuous Random Variables

Distribution (parameters) | pdf f_X(x) | MGF M_X(t) | Mean E[X] | Variance Var(X)
Uniform (a, b) | $\dfrac{1}{b-a}$, a < x < b | $\dfrac{e^{tb} - e^{ta}}{t(b-a)}$ | $\dfrac{a+b}{2}$ | $\dfrac{(b-a)^2}{12}$
Exponential (λ) | $\lambda e^{-\lambda x}$, x ≥ 0 | $\dfrac{\lambda}{\lambda - t}$ | $\dfrac{1}{\lambda}$ | $\dfrac{1}{\lambda^2}$
Normal (μ, σ²) | $\dfrac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(x-\mu)^2 / 2\sigma^2}$, $-\infty < x < \infty$ | $e^{\mu t + \sigma^2 t^2 / 2}$ | $\mu$ | $\sigma^2$