Recurrences (CLRS 4.1-4.2)

• Last time we discussed divide-and-conquer algorithms.

Divide and Conquer

To solve P:
1. Divide P into smaller problems P1, P2, P3, ..., Pk.
2. Conquer by solving the (smaller) subproblems recursively.
3. Combine the solutions to P1, P2, ..., Pk into a solution for P.

• Analysis of divide-and-conquer algorithms, and of recursive algorithms in general, leads to recurrences.
• Merge-sort led to the recurrence T(n) = 2T(n/2) + n
  – or rather,

        T(n) = Θ(1)                          if n = 1
        T(n) = T(⌈n/2⌉) + T(⌊n/2⌋) + Θ(n)    if n > 1

  – but we will often cheat and just solve the simple formula (equivalent to assuming that n = 2^k for some constant k, and leaving out the base case and the constant in Θ).

Methods for solving recurrences

1. Substitution method
2. Iteration method
   • Recursion-tree method
   • (Master method)

1 Solving Recurrences with the Substitution Method

• Idea: make a guess for the form of the solution and prove it correct by induction.
• Can be used to prove both upper bounds O() and lower bounds Ω().
• Let's solve T(n) = 2T(n/2) + n using substitution.
  – Guess T(n) ≤ cn log n for some constant c (that is, T(n) = O(n log n)).
  – Proof:
    ∗ Base case: we need to show that our guess holds for some base case (not necessarily n = 1; some small n is ok). This is fine, since the function is constant for small constant n.
    ∗ Assume the guess holds for n/2: T(n/2) ≤ c(n/2) log(n/2). (Question: why not n − 1?)
      Prove that it holds for n, that is, T(n) ≤ cn log n:

        T(n) = 2T(n/2) + n
             ≤ 2(c(n/2) log(n/2)) + n
             = cn log(n/2) + n
             = cn log n − cn log 2 + n
             = cn log n − cn + n

      So the guess holds if c ≥ 1.
• Similarly it can be shown that T(n) = Ω(n log n). Exercise!
• Similarly it can be shown that T(n) = T(⌊n/2⌋) + T(⌈n/2⌉) + n is Θ(n lg n). Exercise!
• The hard part of the substitution method is often to make a good guess. How do we make a good (i.e. tight) guess? Unfortunately, there is no "recipe" for this. Try iteratively O(n^3), Ω(n^3), O(n^2), Ω(n^2), and so on.
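The guess can also be sanity-checked numerically. The sketch below (the function name and the exact base case T(1) = 1 are our own choices, not from the notes) evaluates the recurrence directly for powers of two, where it solves exactly to n log n + n, consistent with the O(n log n) bound proved above.

```python
from math import log2

def T(n):
    """Evaluate T(n) = 2T(n/2) + n with T(1) = 1, for n a power of two."""
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

# For n = 2^k the recurrence solves exactly to n log n + n,
# matching the T(n) <= cn log n guess with c >= 2 (and O(n log n) overall).
for k in range(1, 11):
    n = 2 ** k
    assert T(n) == n * log2(n) + n
```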
Try solving by iteration to get a feeling for the growth.

2 Solving Recurrences with the Iteration/Recursion-tree Method

• In the iteration method we iteratively "unfold" the recurrence until we "see the pattern".
• The iteration method does not require making a good guess like the substitution method does (but it is often more involved than using induction).
• Example: solve T(n) = 8T(n/2) + n^2 (T(1) = 1).

    T(n) = n^2 + 8T(n/2)
         = n^2 + 8(8T(n/2^2) + (n/2)^2)
         = n^2 + 8^2 T(n/2^2) + 8(n^2/4)
         = n^2 + 2n^2 + 8^2 T(n/2^2)
         = n^2 + 2n^2 + 8^2 (8T(n/2^3) + (n/2^2)^2)
         = n^2 + 2n^2 + 8^3 T(n/2^3) + 8^2 (n^2/4^2)
         = n^2 + 2n^2 + 2^2 n^2 + 8^3 T(n/2^3)
         = ...
         = n^2 + 2n^2 + 2^2 n^2 + 2^3 n^2 + 2^4 n^2 + ...

  – Recursion depth: how many iterations does it take until the subproblem has constant size? i iterations, where n/2^i = 1 ⇒ i = log n.
  – What is the last term? 8^i T(1) = 8^{log n}.

    T(n) = n^2 + 2n^2 + 2^2 n^2 + 2^3 n^2 + 2^4 n^2 + ... + 2^{log n − 1} n^2 + 8^{log n}
         = Σ_{k=0}^{log n − 1} 2^k n^2 + 8^{log n}
         = n^2 Σ_{k=0}^{log n − 1} 2^k + (2^3)^{log n}

• Now Σ_{k=0}^{log n − 1} 2^k is a geometric sum, so we have Σ_{k=0}^{log n − 1} 2^k = Θ(2^{log n − 1}) = Θ(n).
• (2^3)^{log n} = (2^{log n})^3 = n^3.

    T(n) = n^2 · Θ(n) + n^3 = Θ(n^3)

2.1 Recursion tree

A different way to look at the iteration method is the recursion tree, discussed in the book (4.2).

• We draw out the recursion tree with the cost of a single call in each node; the running time is the sum of the costs in all nodes.
• If you are careful drawing the recursion tree and summing up the costs, the recursion tree is a direct proof of the solution of the recurrence, just like iteration and substitution.
• Example: T(n) = 8T(n/2) + n^2 (T(1) = 1).

  [Figure: the recursion tree unfolded in three steps. Level 0 (the root) costs n^2; level 1 has 8 nodes of cost (n/2)^2, i.e. 8(n/2)^2 = 2n^2 in total; level 2 contributes 8^2 (n/4)^2 = 2^2 n^2; and so on for log n levels, with 8^{log n} T(1) at the leaves.]

    T(n) = n^2 + 2n^2 + 2^2 n^2 + 2^3 n^2 + 2^4 n^2 + ...
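The unfolding above can be checked by evaluating the recurrence directly. A small sketch (assuming T(1) = 1 and n a power of two, as in the example); summing the geometric series exactly gives T(n) = n^2 (n − 1) + n^3 = 2n^3 − n^2, which is Θ(n^3) as derived.

```python
def T(n):
    """Evaluate T(n) = 8T(n/2) + n^2 with T(1) = 1, for n a power of two."""
    if n == 1:
        return 1
    return 8 * T(n // 2) + n * n

# The iteration method gives T(n) = n^2 * (2^{log n} - 1) + 8^{log n}
#                                 = n^2 * (n - 1) + n^3 = 2n^3 - n^2,
# i.e. Theta(n^3), matching the calculation above.
for k in range(11):
    n = 2 ** k
    assert T(n) == 2 * n**3 - n**2
```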
         + 2^{log n − 1} n^2 + 8^{log n}

3 Matrix Multiplication

• Let X and Y be n × n matrices:

    X = | x11  x12  ...  x1n |
        | x21  x22  ...  x2n |
        | x31  x32  ...  x3n |
        | ...  ...  ...  ... |
        | xn1  xn2  ...  xnn |

• We want to compute Z = X · Y:
  – z_ij = Σ_{k=1}^{n} x_ik · y_kj
• The naive method uses n^2 · n = Θ(n^3) operations (n^2 entries, each an inner product of length n).
• Divide-and-conquer solution:

    Z = | A  B | · | E  F | = | (A·E + B·G)  (A·F + B·H) |
        | C  D |   | G  H |   | (C·E + D·G)  (C·F + D·H) |

  – The above naturally leads to a divide-and-conquer solution:
    ∗ Divide X into sub-matrices A, B, C, D and Y into sub-matrices E, F, G, H.
    ∗ Do 8 matrix multiplications recursively.
    ∗ Compute Z by combining the results (doing 4 matrix additions).
  – Let's assume n = 2^c for some constant c, and let A, B, C, and D be n/2 × n/2 matrices.
    ∗ The running time of the algorithm is T(n) = 8T(n/2) + Θ(n^2) ⇒ T(n) = Θ(n^3).
  – But we already discussed a (simpler/naive) O(n^3) algorithm! Can we do better?

3.1 Strassen's Algorithm

• Strassen observed the following:

    Z = | A  B | · | E  F | = | (S1 + S2 − S4 + S6)  (S4 + S5)           |
        | C  D |   | G  H |   | (S6 + S7)            (S2 + S3 + S5 − S7) |

  where

    S1 = (B − D) · (G + H)
    S2 = (A + D) · (E + H)
    S3 = (C − A) · (E + F)
    S4 = (A + B) · H
    S5 = A · (F − H)
    S6 = D · (G − E)
    S7 = (C + D) · E

  – Let's test that S6 + S7 really is C·E + D·G:

    S6 + S7 = D · (G − E) + (C + D) · E
            = DG − DE + CE + DE
            = DG + CE

• This leads to a divide-and-conquer algorithm with running time T(n) = 7T(n/2) + Θ(n^2):
  – We only need to perform 7 multiplications recursively.
  – Division/combination can still be performed in Θ(n^2) time.
• Let's solve the recurrence using the iteration method:

    T(n) = 7T(n/2) + n^2
         = n^2 + 7(7T(n/2^2) + (n/2)^2)
         = n^2 + (7/2^2) n^2 + 7^2 T(n/2^2)
         = n^2 + (7/2^2) n^2 + 7^2 (7T(n/2^3) + (n/2^2)^2)
         = n^2 + (7/2^2) n^2 + (7/2^2)^2 n^2 + 7^3 T(n/2^3)
         = n^2 + (7/2^2) n^2 + (7/2^2)^2 n^2 + (7/2^2)^3 n^2 + ...
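Strassen's identities can be verified mechanically on scalars (the 2 × 2 case, where the blocks A, ..., H are single numbers). Note the sign convention: with z22 = S2 + S3 + S5 − S7, the product S3 must be (C − A)·(E + F); the sketch below (function names are ours) uses that convention and checks all four entries against the naive product.

```python
import random

def strassen_2x2(X, Y):
    """Multiply 2x2 matrices with Strassen's 7 products (scalar blocks)."""
    (a, b), (c, d) = X
    (e, f), (g, h) = Y
    s1 = (b - d) * (g + h)
    s2 = (a + d) * (e + h)
    s3 = (c - a) * (e + f)   # sign convention matching z22 = s2 + s3 + s5 - s7
    s4 = (a + b) * h
    s5 = a * (f - h)
    s6 = d * (g - e)
    s7 = (c + d) * e
    return [[s1 + s2 - s4 + s6, s4 + s5],
            [s6 + s7, s2 + s3 + s5 - s7]]

def naive_2x2(X, Y):
    """Naive product: z_ij = sum_k x_ik * y_kj."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# The 7-product formulas agree with the naive product entry by entry.
for _ in range(100):
    X = [[random.randint(-9, 9) for _ in range(2)] for _ in range(2)]
    Y = [[random.randint(-9, 9) for _ in range(2)] for _ in range(2)]
    assert strassen_2x2(X, Y) == naive_2x2(X, Y)
```

In the full algorithm the same seven formulas are applied to n/2 × n/2 blocks, with the scalar multiplications replaced by recursive calls.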
         + (7/2^2)^{log n − 1} n^2 + 7^{log n}
         = Σ_{i=0}^{log n − 1} (7/2^2)^i n^2 + 7^{log n}
         = n^2 · Θ((7/2^2)^{log n − 1}) + 7^{log n}
         = n^2 · Θ(7^{log n} / (2^2)^{log n}) + 7^{log n}
         = n^2 · Θ(7^{log n} / n^2) + 7^{log n}
         = Θ(7^{log n})

  – Now we have the following:

    7^{log n} = 7^{log_7 n / log_7 2}
              = (7^{log_7 n})^{1/log_7 2}
              = n^{1/log_7 2}
              = n^{log_2 7 / log_2 2}
              = n^{log 7}

  – Or in general: a^{log_k n} = n^{log_k a}.

  So the solution is T(n) = Θ(n^{log 7}) = Θ(n^{2.81...}).

• Note:
  – We are 'hiding' a much bigger constant in Θ() than before.
  – The currently best known bound is O(n^{2.376...}) (obtained by another method).
  – The lower bound is (trivially) Ω(n^2).

4 Master Method

• We have solved several recurrences using substitution and iteration.
• We solved several recurrences of the form T(n) = aT(n/b) + n^c (T(1) = 1):
  – Strassen's algorithm ⇒ T(n) = 7T(n/2) + n^2 (a = 7, b = 2, and c = 2).
  – Merge-sort ⇒ T(n) = 2T(n/2) + n (a = 2, b = 2, and c = 1).
• It would be nice to have a general solution to the recurrence T(n) = aT(n/b) + n^c.
• We do!

    T(n) = aT(n/b) + n^c,   a ≥ 1, b ≥ 1, c > 0
    ⇓
    T(n) = Θ(n^{log_b a})   if a > b^c
    T(n) = Θ(n^c log n)     if a = b^c
    T(n) = Θ(n^c)           if a < b^c
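The three-way case analysis of this simplified master method (comparing a against b^c) fits in a few lines. A minimal sketch; the function name `master` and the returned strings are illustrative choices of our own:

```python
from math import log

def master(a, b, c):
    """Asymptotic solution of T(n) = a*T(n/b) + n^c via the simplified
    master method, returned as a human-readable Theta expression."""
    if a > b ** c:        # recursion-heavy: leaves dominate
        return f"Theta(n^{log(a, b):.3f})"   # exponent is log_b a
    elif a == b ** c:     # balanced: every level costs n^c
        return f"Theta(n^{c} log n)"
    else:                 # root-heavy: the n^c term dominates
        return f"Theta(n^{c})"

assert master(2, 2, 1) == "Theta(n^1 log n)"   # merge-sort
assert master(7, 2, 2) == "Theta(n^2.807)"     # Strassen: n^{log 7}
```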