Tutorials on Probability Bounds
Kanat Tangwongsan ([email protected])
Computer Science Department
Carnegie Mellon University
September 27, 2007

Lecture 10: Probability Bounds

Outline
Main focus: Chernoff bounds and their applications to proving
high-probability bounds.
  - Basic bounds: Markov's inequality, Chebyshev's inequality
  - Chernoff bounds
  - Examples: tree contraction, line breaking (revisited)

Basic Bounds
Markov's inequality. For any random variable X ≥ 0 and any λ > 0,

    Pr[X ≥ λ] ≤ E[X] / λ.

More generally, if f : R → R+ is a positive function, we have

    Pr[f(X) ≥ f(λ)] ≤ E[f(X)] / f(λ).

If f is strictly increasing, then Pr[X > λ] = Pr[f(X) > f(λ)], so this
bound applies to X itself.

(Blackboard interlude)

Basic Bounds (contd.)
Chebyshev's inequality. For any random variable X ∈ R and any λ > 0,

    Pr[|X − E[X]| ≥ λ] ≤ Var(X) / λ².

Chernoff Bounds
Let X = Σᵢ Xᵢ, where the Xᵢ's are independent and E[Xᵢ] = pᵢ; write
μ = E[X] = Σᵢ pᵢ.

Will show: for 0 < δ < 1,

    Pr[X < (1 − δ)μ] ≤ e^(−μδ²/2).

Useful facts:
  - 1 + x ≤ eˣ for all x
  - (1 − δ) ln(1 − δ) > −δ + δ²/2 for 0 < δ < 1

(Blackboard interlude)

Chernoff Bounds (contd.)
The following bounds can be shown similarly:
  1. For any δ > 0,    Pr[X < (1 − δ)μ] < ( e^(−δ) / (1 − δ)^(1−δ) )^μ.
  2. For 0 < δ < 1,    Pr[X < (1 − δ)μ] ≤ e^(−μδ²/2).
  3. For any δ > 0,    Pr[X ≥ (1 + δ)μ] < ( e^δ / (1 + δ)^(1+δ) )^μ.
  4. For 0 < δ ≤ 1,    Pr[X ≥ (1 + δ)μ] ≤ e^(−μδ²/3).
  5. For R ≥ 6μ,       Pr[X ≥ R] ≤ 2^(−R).

Randomized (Lazy) Quick Sort
Rand-QSort(A) =
  0. If |A| ≤ 1, return A.
  1. Pick e ∈ A uniformly at random.
  2. Split A into A< and A>.
  3. Go to Step 1 if max{|A<|, |A>|} > (3/4)|A|.
  4. Return Rand-QSort(A<) ++ {e} ++ Rand-QSort(A>).

Parallel Tree Contraction (Miller and Reif, 1985)
Each contraction step performs rake and compress (in whichever order
you wish):
  - Rake: contract all leaves into their parent nodes.
  - Compress: shorten long chains of one-child nodes by employing a
    standard coin-flipping algorithm.
All nodes flip coins; if a node comes up "heads" and has a single
child that comes up "tails", it contracts that child into itself.

In your homework: 1) each contraction step removes at least a
constant fraction of the nodes w.h.p., and 2) the algorithm
terminates in O(log n) steps w.h.p.

Line Breaking
Two major components:
  - If I start a line, who should start the next line?
  - You have a tree; what do you do with it?
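As a sanity check, the Rand-QSort routine above can be sketched in Python. This is a minimal sequential sketch (it additionally keeps duplicates of the pivot together, which the singleton {e} in the pseudocode glosses over), not the lecture's implementation. Step 3's rejection loop accepts a pivot whenever at least half the ranks would do, so the expected number of retries per call is constant.

```python
import random

def rand_qsort(a):
    """Lazy randomized quicksort: retry the pivot until the split is
    3/4-balanced (Step 3 of Rand-QSort), then recurse on both sides."""
    if len(a) <= 1:                          # Step 0
        return list(a)
    while True:
        e = random.choice(a)                 # Step 1: uniform random pivot
        less = [x for x in a if x < e]       # Step 2: split into A< and A>
        greater = [x for x in a if x > e]
        # Step 3: reject an unbalanced split and flip again.
        if max(len(less), len(greater)) <= 3 * len(a) / 4:
            break
    equal = [x for x in a if x == e]         # pivot plus its duplicates
    return rand_qsort(less) + equal + rand_qsort(greater)
```

With distinct elements, a pivot is accepted exactly when its rank lies in the middle half, so each retry succeeds with probability at least 1/2; a Chernoff bound over the independent retries is what turns this into the w.h.p. depth bound.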
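To see the lower-tail bound Pr[X < (1 − δ)μ] ≤ e^(−μδ²/2) in action, here is a small Python experiment. The parameter choices (n = 200 Bernoulli(1/2) variables, δ = 0.3, 5000 trials) are illustrative assumptions, not values from the lecture.

```python
import math
import random

def chernoff_lower_tail_demo(n=200, p=0.5, delta=0.3, trials=5000, seed=0):
    """Compare the empirical lower-tail probability Pr[X < (1 - delta)*mu]
    with the Chernoff bound e^(-mu*delta^2/2), where X is a sum of n
    independent Bernoulli(p) variables and mu = n*p.

    The parameters are illustrative assumptions, not from the lecture.
    """
    rng = random.Random(seed)
    mu = n * p
    threshold = (1 - delta) * mu
    hits = 0
    for _ in range(trials):
        # One sample of X = sum of n independent Bernoulli(p) coins.
        x = sum(1 for _ in range(n) if rng.random() < p)
        if x < threshold:
            hits += 1
    empirical = hits / trials
    bound = math.exp(-mu * delta * delta / 2)
    return empirical, bound
```

For these parameters the bound is e^(−4.5) ≈ 0.011, while the true tail probability is far smaller (the threshold sits more than four standard deviations below the mean), so the empirical frequency should come out well under the bound.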