Recursion Trees and Master Method


CS 3110 Lecture 20: Recursion trees and master method for recurrence relations

Note: this page uses the following special characters: Greek capital letter theta (Θ), Greek capital letter omega (Ω), minus sign (−). If these characters do not appear correctly, your browser is not able to fully handle HTML 4.0, and some of the following text will likely not have the correct appearance.

Recursion trees

A recursion tree is useful for visualizing what happens when a recurrence is iterated. It diagrams the tree of recursive calls, and the amount of work done at each call.

For instance consider the recurrence

T(n) = 2T(n/2) + n^2.

The recursion tree for this recurrence is of the following form:

         |                n^2
         |               /   \
         |        (n/2)^2     (n/2)^2
height = |         /  \         /  \
  lg n   | (n/4)^2  (n/4)^2  (n/4)^2  (n/4)^2
         |  / \      / \      / \      / \
         |   .
         |   .
         |   .

Generally it is straightforward to sum across each row of the tree, to obtain the total work done at a given level:

         |                n^2                              n^2
         |               /   \
         |        (n/2)^2     (n/2)^2                  (1/2)n^2
height = |         /  \         /  \
  lg n   | (n/4)^2  (n/4)^2  (n/4)^2  (n/4)^2          (1/4)n^2
         |  / \      / \      / \      / \
         |   .
         |   .
         |   .

This is a geometric series, and thus in the limit the sum is O(n^2). In other words, the depth of the tree in this case does not really matter: the amount of work at each level is decreasing so quickly that the total is only a constant factor more than the root.
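To make the geometric-series argument concrete, here is a small numeric check (ours, not part of the original notes; the function name is hypothetical). It sums the per-level work n^2 · (1/2)^i over the lg n levels of the tree:

(* Per-level work in the recursion tree of T(n) = 2T(n/2) + n^2:
   level i contributes (1/2)^i * n^2.  Summing over the lg n levels
   gives a geometric series whose total stays below 2n^2. *)
let level_sum n =
  let levels = int_of_float (log (float_of_int n) /. log 2.) in
  let rec go i acc =
    if i > levels then acc
    else go (i + 1) (acc +. float_of_int (n * n) *. (0.5 ** float_of_int i))
  in
  go 0 0.

For n = 1024 the total is just under 2n^2: only a constant factor more than the n^2 work at the root, regardless of the tree's depth.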

Recursion trees can be useful for gaining intuition into the closed form of a recurrence, but they are not a proof (and in fact it is easy to get the wrong answer with a recursion tree, as is the case with any method that includes ''...'' kinds of reasoning). As we saw last time, a good way of establishing a closed form for a recurrence is to guess an answer and then prove by induction that the answer is correct. Recursion trees can be a good method of guessing an answer.

Let's consider another example,

T(n) = T(n/3) + T(2n/3) + n.

Expanding out the first few levels, the recurrence tree is:

            |                n                             n
            |              /   \
            |         (n/3)     (2n/3)                     n
  height =  |          / \        / \
log_{3/2} n | (n/9)  (2n/9)  (2n/9)  (4n/9)                n
            |  / \     / \     / \     / \
            |   .
            |   .
            |   .

Note that the tree here is not balanced: the longest path keeps reducing n by a factor of 2/3 and thus has length log_{3/2} n. Hence our guess as to the closed form of this recurrence is O(n lg n).
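As a sanity check on this guess (a check we add here, not a proof), one can iterate the recurrence numerically, using T(n) = 0 for n < 1 as an assumed base case, and compare the result to n lg n:

(* Iterate T(n) = T(n/3) + T(2n/3) + n directly, with the assumed
   base case T(n) = 0 for n < 1. *)
let rec t n = if n < 1. then 0. else t (n /. 3.) +. t (2. *. n /. 3.) +. n

(* Ratio of T(n) to n lg n.  It stays near a constant as n grows,
   which is consistent with the O(n lg n) guess. *)
let ratio n = t n /. (n *. (log n /. log 2.))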

The master method

The “master method” is a cookbook method for solving recurrences that is very handy for dealing with many recurrences seen in practice. Suppose you have a recurrence of the form

T(n)=aT(n/b)+f(n).

In other words, the recurrence has a subproblems, each of size n/b, where the work to split the problem into subproblems and recombine the results is f(n).

We can visualize this as a recurrence tree, where the nodes in the tree have a branching factor of a. The top node has work f(n) associated with it, the next level has work f(n/b) associated with each node, the next level has work f(n/b^2) associated with each node, and so on. The tree has log_b n levels, so the total number of leaves in the tree is a^(log_b n), which as a function of n is n^(log_b a).
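The leaf-count identity used here follows from writing a as b^{\log_b a} and swapping exponents:

\[
a^{\log_b n} \;=\; \left(b^{\log_b a}\right)^{\log_b n} \;=\; \left(b^{\log_b n}\right)^{\log_b a} \;=\; n^{\log_b a}.
\]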

The time taken is just the sum of the terms f(n/b^i) at all the nodes. What this sum looks like depends on how the asymptotic growth of f(n) compares to the asymptotic growth of the number of leaves. There are three cases:

Case 1: f(n) is O(n^(log_b a − ε)) for some constant ε > 0. Since the leaves grow faster than f, asymptotically all of the work is done at the leaves, and so T(n) is Θ(n^(log_b a)).

Case 2: f(n) is Θ(n^(log_b a)). The leaves grow at the same rate as f, so the same order of work is done at every level of the tree. The tree has O(lg n) levels, times the work done on one level, yielding T(n) is Θ(n^(log_b a) lg n).

Case 3: f(n) is Ω(n^(log_b a + ε)) for some constant ε > 0. In this case we also need to show that af(n/b) ≤ kf(n) for some constant k < 1 and all sufficiently large n, which means that f grows faster than the number of leaves. Asymptotically all of the work is done at the root node, so T(n) is Θ(f(n)).
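The three cases can be captured in a few lines of code for the common special case f(n) = Θ(n^d); this sketch (ours, not part of the notes) just compares d to the leaf exponent log_b a. For such polynomial f, Case 3's regularity condition holds automatically, since af(n/b) = (a/b^d)·n^d with a/b^d < 1.

(* Master method specialized to T(n) = a T(n/b) + Theta(n^d).
   Compares d against the leaf exponent c = log_b a and reports the
   closed form as a string.  Floats are compared up to a tolerance. *)
let master a b d =
  let c = log a /. log b in
  if abs_float (d -. c) < 1e-9 then
    Printf.sprintf "Theta(n^%g lg n)" c      (* Case 2: d = log_b a *)
  else if d < c then
    Printf.sprintf "Theta(n^%g)" c           (* Case 1: d < log_b a *)
  else
    Printf.sprintf "Theta(n^%g)" d           (* Case 3: d > log_b a *)

For instance, master 4. 2. 1. reports "Theta(n^2)", matching Example 1 below.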

Page 3: Recursion Trees and Master Method

Note that the master method does not always apply. In fact the second example considered above, where the subproblem sizes are unequal, is not covered by the master method.

Let's look at a few examples where the master method does apply.

Example 1 Consider the recurrence

T(n)=4T(n/2)+n.

For this recurrence, there are a=4 subproblems, each dividing the input by b=2, and the work done on each call is f(n) = n. Thus n^(log_b a) is n^2, and f(n) is O(n^(2−ε)) for ε=1, so Case 1 applies. Thus T(n) is Θ(n^2).

Example 2 Consider the recurrence

T(n)=4T(n/2)+n2.

For this recurrence, there are again a=4 subproblems, each dividing the input by b=2, but now the work done on each call is f(n) = n^2. Again n^(log_b a) is n^2, and f(n) is thus Θ(n^2), so Case 2 applies. Thus T(n) is Θ(n^2 lg n). Note that increasing the work on each recursive call from linear to quadratic has increased the overall asymptotic running time only by a logarithmic factor.

Example 3 Consider the recurrence

T(n)=4T(n/2)+n3.

For this recurrence, there are again a=4 subproblems, each dividing the input by b=2, but now the work done on each call is f(n) = n^3. Again n^(log_b a) is n^2, and f(n) is thus Ω(n^(2+ε)) for ε=1. Moreover, the regularity condition holds: 4(n/2)^3 = (1/2)n^3 ≤ kn^3 for k=1/2, so Case 3 applies. Thus T(n) is Θ(n^3).
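For n a power of two (and an assumed base case T(1) = 1), the three recurrences above unroll to exact closed forms: 2n^2 − n, n^2(lg n + 1), and 2n^3 − n^2 respectively, agreeing with the master-method answers. A quick check of this (ours, not in the notes):

(* The three example recurrences, evaluated exactly for n a power
   of 2, with the assumed base case T(1) = 1. *)
let rec t1 n = if n <= 1 then 1 else 4 * t1 (n / 2) + n          (* f(n) = n   *)
let rec t2 n = if n <= 1 then 1 else 4 * t2 (n / 2) + n * n      (* f(n) = n^2 *)
let rec t3 n = if n <= 1 then 1 else 4 * t3 (n / 2) + n * n * n  (* f(n) = n^3 *)

(* Integer lg, exact for powers of two. *)
let rec lg n = if n <= 1 then 0 else 1 + lg (n / 2)

One checks, for example, that t1 1024 equals 2·1024^2 − 1024, that t2 1024 equals 1024^2·(lg 1024 + 1), and that t3 1024 equals 2·1024^3 − 1024^2, i.e. Θ(n^2), Θ(n^2 lg n), and Θ(n^3).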

Example: Yet another sorting algorithm

The following function sorts the first two-thirds of a list, then the second two-thirds, then the first two-thirds again:

(* take and drop (the first m elements of a list, and the list without
   its first m elements) are assumed helpers; definitions are included
   here so the code is self-contained. *)
let rec take l m = if m = 0 then [] else
  match l with [] -> [] | h :: t -> h :: take t (m - 1)
let rec drop l m = if m = 0 then l else
  match l with [] -> [] | _ :: t -> drop t (m - 1)

let rec sort3 a =
  match a with
    [] -> []
  | [x] -> [x]
  | [x; y] -> [(min x y); (max x y)]
  | _ ->
      let n = List.length a in
      let m = (2 * n + 2) / 3 in
      let res1 = sort3 (take a m) @ (drop a m) in
      let res2 = (take res1 (n - m)) @ sort3 (drop res1 (n - m)) in
      sort3 (take res2 m) @ (drop res2 m)

Perhaps surprisingly, this algorithm does sort the list. We leave the proof that it sorts correctly as an exercise for the reader. The key is to observe that the first two passes ensure that the tail of the final list contains the correct elements in the correct order.

We can derive the running time of the algorithm from its recurrence. The routine does some O(n) work and then makes three recursive calls on lists of length 2n/3. Therefore its recurrence is:


T(n) = cn + 3T(2n/3)

If we apply the master method to the sort3 algorithm, we see easily that we are in Case 1, so the algorithm is O(n^(log_{3/2} 3)) = O(n^2.71), making it even slower than insertion sort! Note that the fact that Case 1 applies means that improving f(n) will not improve the overall time. For instance, replacing lists with arrays improves f(n) from linear to constant time, but the asymptotic complexity is still O(n^2.71).
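The exponent quoted here comes from the change-of-base formula log_{3/2} 3 = ln 3 / ln(3/2); a one-liner (ours) confirms the value:

(* Exponent in sort3's running time: log base 3/2 of 3,
   via log_b a = ln a / ln b. *)
let sort3_exponent = log 3. /. log 1.5
(* Evaluates to about 2.7095, so T(n) is O(n^2.71). *)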