Stochastic Models for Communication Networks
Jean Walrand, University of California, Berkeley

Contents (tentative)
- Big Picture: store-and-forward; packets, transactions, users
- Queuing: Little's law and applications; stability of Markov chains
- Scheduling
- Markov decision problems: discrete time (DT: LP formulation); continuous time
- Transaction-level models: models of TCP; stability
- Random networks: connectivity; results on percolation

Big Picture: Store-and-Forward
- Packets: actual packets vs. header descriptors (example: linked lists)
- [Figure: line card with IN-FIFO, serial/parallel converter, OUT-FIFO, parallel/serial converter, Rx, Tx, and a scheduler]

Big Picture: Packets, Transactions, Users
- Packets: delay, backlog
- Transactions: bit rate, duration
- Users: alternate between active and idle periods; get/send files as a Poisson process with a given rate; the transition rates may depend on the perceived QoS

Queuing
- Little's law
- Stability of Markov chains

Little's Law
- Roughly: average number in system = average arrival rate x average time in system, i.e., L = λW.
- Example: 1000 packet arrivals/second, each packet spending 0.1 second in the system ⇒ 100 packets in the system, on average.
- Notes: the system does not have to be FIFO or work-conserving; the law applies to any subset of the customers; it holds under weak assumptions (stability).

Little's Law: Extension
- Average income per unit time = average arrival rate x average cost per customer, i.e., H = λG.
- Example: 1000 customer arrivals/second, each customer spending $5.00 on average ⇒ the system collects $5,000.00 per second, on average.

Little's Law, Illustration 1: Utilization
- Packets arrive at rate λ (per second); the average transmission time is 1/µ (seconds) ⇒ the transmitter is busy a fraction λ/µ of the time.
- Indeed, take the system to be the transmitter, so the number in system is 1 if busy and 0 otherwise. Then W = average time in system = 1/µ, and L = average number in system = fraction of time busy. Hence, by L = λW, the fraction of time busy is λ/µ =: the link utilization.

Little's Law, Illustration 2: Delay in the M/G/1 Queue
- Packets arrive at rate λ (per second); the transmission time is S, with E(S) = 1/µ, var(S) = σ², and ρ = λ/µ.
- Let each customer's instantaneous cost be its residual service time, and let Q be the queuing delay. By H = λG:
  H = E(queuing delay) = E(Q)   [see the PASTA note below]
  G = E(SQ + S²/2) = E(S)E(Q) + E(S²)/2.
- Thus E(Q) = λ{E(S)E(Q) + E(S²)/2}. Solving for E(Q) gives E(Q) = λE(S²)/(2(1 − ρ)); adding E(S) gives the expected delay.
- [Figure: a customer's residual cost over time, from arrival to start of service to departure.]

Little's Law, Illustration 2 (continued): Poisson Arrivals
- (*) We used the fact that a typical customer sees the typical delay in the system. This is true because the arrivals are Poisson. Indeed,
  P[queue is in state x at t | arrival in (t, t + ε)] = P(queue is in state x at time t),
  because the state at time t is a function of the past of the Poisson process, and the process has independent increments.
- Note: this is not true if the arrivals are not Poisson. [Figure: work-to-be-done vs. time; a non-Poisson customer may systematically see less, or more, than the average congestion.]

Little's Law, Illustration 2 (continued)
- M/M/1: σ² = 1/µ², so E(S²) = 2/µ² ⇒ E(Q) = ρ/(µ(1 − ρ)).
- M/D/1: σ² = 0, so E(S²) = 1/µ² ⇒ E(Q) = ρ/(2µ(1 − ρ)), half the M/M/1 value.
- σ² >> 1 ⇒ the average delay is very large.

Little's Law, Illustration 3: M/G/1 Queue with Vacations
- Model: the server goes on vacation every time the queue empties; the vacations are i.i.d. and distributed like V. Then
  T = T0 + E(V²)/(2E(V)),
  where T0 = the average delay without vacations.
- Derivation: assume the server pays U when its residual vacation is U, and each customer pays as in the M/G/1 queue. Then the total expected pay equals the average waiting time until service:
  λE(QS + S²/2) + αE(V²/2) = E(Q), where α = the rate of vacations.
- To find α, note that the server is idle for E(V)αt = (1 − ρ)t seconds out of t >> 1 seconds. Hence α = (1 − ρ)/E(V). Substituting gives the result.

Little's Law: Graphical Proof
- [Figure: bits travel from S to D past points 1 and 2; plot the number of the latest bit seen by time t at point 1 and at point 2. The horizontal gap between the curves at level n is the delay T(n) of bit n; the vertical gap at time t is the backlog X(t).]
- Over [0, T] with N customers, the area between the two curves equals both T(1) + … + T(N) and the integral of X(t). Hence
  (1/T) ∫ X(t) dt = (N/T) × (T(1) + … + T(N))/N,
  i.e., average occupancy = (average arrival rate) × (average delay).

Stability of Markov Chains
- Markov chain (DT): assume the chain is irreducible. Then:
  - all states are null recurrent (NR), positive recurrent (PR), or transient (T) together (certainly PR if the state space is finite);
  - there is 0 or 1 invariant
distribution π, with πP = π: 1 if PR, 0 otherwise;
  - moreover, for every state i, the fraction of time spent in state i converges almost surely to π(i) (and to 0 in the NR and T cases);
  - finally, if the chain is PR and aperiodic, then P[X(n) = i] → π(i) for every i.

Stability of Markov Chains: Pakes' Lemma
- Assume the chain is irreducible and aperiodic on {0, 1, 2, …}. Define the drift
  d(i) = E[X(n+1) − X(n) | X(n) = i].
- Assume there is some i0 and some a > 0 so that d(i) ≤ −a for all i > i0, with d(i) < ∞ for all i. Then the MC is PR.
- Proof idea: the MC cannot stay away from {0, 1, …, i0}; if it stays away for k steps, E(X(n)) decreases by at least ka, which is impossible for large k since X(n) ≥ 0. [Formal argument in the original slides.]

Pakes' Lemma: Variations
- Simple variation: the same conclusion holds if d(i) < ∞ for all i and lim sup of d(i) as i → ∞ is < 0.
- Other variation: the same conclusion holds if there is some finite m such that the m-step drift E[X(n+m) − X(n) | X(n) = i] has the properties indicated above.

Application 1 (inspired by TCP)
- [Figure: definition of the chain X(n).]
- First note that X(n) is irreducible and aperiodic. Also, the original form of Pakes' lemma applies.

Application 2 (inspired by switches)
- A 2x2 switch with virtual output buffers and arrival rates λ(1), λ(2), λ(3), λ(4); think of Bernoulli arrivals and cells of size 1. Note that λ(1) + λ(2) < 1 and λ(3) + λ(4) < 1. At each time, the switch serves one matching: cross (X) or bar (=).
- Stability also requires λ(1) + λ(3) < 1 and λ(2) + λ(4) < 1.
- Maximum throughput scheduling: a schedule achieves maximum throughput if this condition suffices for stability.

Application 2: Maximum Weighted Matching
- Example: queue lengths A = 14, B = 11, C = 15, D = 10. Since B + C > A + D, serve (B, C).

Application 2 (Tassiulas; McKeown et al.)
- Maximum weighted matching achieves maximum throughput.
- Proof: V(x) = ||x||² is a Lyapunov function. That is, E[V(X(n+1)) − V(X(n)) | X(n) = x] is finite, and is less than −ε < 0 for x outside a finite set.

Application 3: Iterated Longest Queue
- Serve the longest queue, then the next longest queue that is compatible with it, and so on.
- Example: A = 14, B = 8, C = 15, D = 10 ⇒ serve C, then B.

Application 3 (A. Dimakis)
- Iterated longest queue achieves maximum throughput.
- Proof: V(x) = max_i(x_i) is a Lyapunov function.

Application 4: Wireless
- Links L1, L2, L3 with an interference radius; links within interference range of one another conflict. [Figure: three links and the resulting conflict graph.]
- Resources j ∈ J, classes k ∈ K, and a conflict graph: classes 1, 2, 3 with rates λ1, λ2, λ3. Possible matches: {1}, {2, 3}. Maximum weight matching serves {2, 3}; longest queue first (LQF) serves {1}.
- As in the previous example: consider fluid limits and use the longest queue size as a Lyapunov function. The result carries over to conflict-graph topologies that strictly include trees.
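Worked Example: Checking L = λW by Simulation
The graphical proof of Little's law suggests a direct numerical check. The sketch below (an illustrative Python simulation, not part of the original slides; the rates λ = 0.5 and µ = 1 are chosen for concreteness) runs an M/M/1 queue and measures W, L, and the arrival rate over the window ending at the last departure; by the area argument, the identity L = λW then holds exactly on the sample path.

```python
import random

random.seed(1)
lam, mu, N = 0.5, 1.0, 100_000

# Poisson arrivals: i.i.d. exponential interarrival times with rate lam
A, t = [], 0.0
for _ in range(N):
    t += random.expovariate(lam)
    A.append(t)
# i.i.d. exponential service times with rate mu
S = [random.expovariate(mu) for _ in range(N)]

# FIFO departures: D[n] = max(A[n], D[n-1]) + S[n]
D, last = [], 0.0
for a, s in zip(A, S):
    last = max(a, last) + s
    D.append(last)

T = D[-1]                                 # horizon: last departure
total = sum(d - a for a, d in zip(A, D))  # area between the two curves
W = total / N        # average time in system
L = total / T        # time-average number in system
lam_hat = N / T      # empirical arrival rate over [0, T]

print(W, L, lam_hat)  # L equals lam_hat * W exactly over this window
```

With ρ = 0.5, theory gives W = 1/(µ − λ) = 2 and L = ρ/(1 − ρ) = 1, so for large N the estimates should be close to those values as well.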
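Worked Example: Evaluating the M/G/1 Delay
The solved form of the M/G/1 queuing delay, E(Q) = λE(S²)/(2(1 − ρ)) (the Pollaczek–Khinchine formula), is straightforward to evaluate. A minimal sketch (the function name mg1_delay is mine, not from the slides) comparing M/M/1 and M/D/1 at ρ = 0.5:

```python
def mg1_delay(lam, ES, ES2):
    """Expected delay (queuing + service) in an M/G/1 queue via
    Pollaczek-Khinchine: E(Q) = lam * E(S^2) / (2 * (1 - rho))."""
    rho = lam * ES
    assert rho < 1, "unstable: need rho < 1"
    return lam * ES2 / (2 * (1 - rho)) + ES

lam, mu = 0.5, 1.0
mm1 = mg1_delay(lam, 1 / mu, 2 / mu**2)  # M/M/1: E(S^2) = 2/mu^2
md1 = mg1_delay(lam, 1 / mu, 1 / mu**2)  # M/D/1: E(S^2) = 1/mu^2
print(mm1, md1)  # -> 2.0 1.5
```

The M/D/1 queuing delay (0.5) is half the M/M/1 value (1.0), matching the σ² = 0 case in the slides.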
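Worked Example: Checking Pakes' Drift Condition
Pakes' lemma can be checked mechanically once the one-step drift d(i) is computable. As a hypothetical stand-in for the TCP-inspired chain (whose exact definition is in a figure not reproduced here), take an AIMD-like chain that grows by 1 with probability p and halves otherwise; the sketch verifies d(i) ≤ −a for all i beyond some i0:

```python
def drift(i, p=0.3):
    # One-step drift d(i) = E[X(n+1) - X(n) | X(n) = i] for a
    # hypothetical AIMD-like chain: X -> X + 1 w.p. p, X -> X // 2 w.p. 1 - p.
    return p * 1 + (1 - p) * (i // 2 - i)

# Pakes' condition: d(i) <= -a for all i > i0, with d(i) finite everywhere.
i0, a = 2, 0.1
print(all(drift(i) <= -a for i in range(i0 + 1, 10_000)))  # -> True
```

Since the drift tends to −∞ with i, the condition holds and the lemma yields positive recurrence of this chain.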
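Worked Example: 2x2 Switch Scheduling Rules
For the 2x2 switch examples above, both scheduling rules reduce to choosing between the two matchings (A, D) and (B, C). A small sketch (function names mine, not from the slides) reproducing the slides' numerical examples:

```python
def max_weight_matching(A, B, C, D):
    # 2x2 switch: the two matchings serve (A, D) ("bar") or (B, C) ("cross");
    # pick the matching with the larger total queue length.
    return ("B", "C") if B + C > A + D else ("A", "D")

def iterated_longest_queue(q):
    # Serve the longest queue, then the longest queue compatible with it.
    # In the 2x2 switch, A is compatible only with D, and B only with C.
    compat = {"A": "D", "D": "A", "B": "C", "C": "B"}
    first = max(q, key=q.get)
    return (first, compat[first])

print(max_weight_matching(14, 11, 15, 10))                          # ('B', 'C')
print(iterated_longest_queue({"A": 14, "B": 8, "C": 15, "D": 10}))  # ('C', 'B')
```

Both outputs match the slides: B + C = 26 > 24 = A + D, so max-weight matching serves (B, C); and with B = 8, the longest queue is C, whose only compatible queue is B.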