CS140-lecture-20211014

Recurrences cont. #

Recursion tree method #

Another method for solving recurrences is to draw a recursion tree, where each node gets a cost. The cost of a node is the additional work done in that call, not counting the work done by its recursive calls.

image_2021-10-14-14-05-32

The leaf nodes are the base cases. The idea is to identify a pattern, and use a known series to evaluate that pattern.
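
For recurrences of the shape \( T(n) = aT(n/b) + f(n) \) (which covers most of the examples below), level \( k \) of the tree has \( a^k \) nodes that each cost \( f(n/b^k) \), the leaves sit at depth \( \log_b n \), and there are \( a^{\log_b n} = n^{\log_b a} \) of them. So, assuming a constant-cost base case:

\[ T(n) = \sum_{k=0}^{\log_b n - 1} a^k f\left(\frac{n}{b^k}\right) + \Theta\left(n^{\log_b a}\right) \]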

image_2021-10-14-14-13-14

Then, after identifying the sum of each level of the tree, you sum all the levels themselves (except the base level, the leaves). Notice that the height of this tree is \( \lg n \) . We sum the internal levels (notice that \( k \) runs from \( 0 \) to \( \lg n - 1 \)), then add the cost of the last level (there are \( 2^{\lg n} = n \) leaf nodes). So we can describe this summation like this:

\[\begin{aligned} \sum_{k=0}^{\lg n - 1} \frac{n^2}{2^k} + n W(1) &= \frac{\left(\frac{1}{2}\right)^{(\lg n - 1) + 1} - 1}{\frac{1}{2} - 1} n^2 + \Theta(n) \\ &= \left( 2 - \frac{2}{n} \right) n^2 + \Theta(n) \\ &= \Theta(n^2) \end{aligned}\]
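
As a quick sanity check of that closed form, here is a minimal sketch (assuming the recurrence behind this tree is \( W(n) = 2W(n/2) + n^2 \) with \( W(1) = 1 \), consistent with the per-level costs above):

```python
def w(n, w1=1):
    # Evaluate W(n) = 2*W(n/2) + n^2 directly; n is assumed to be a power of 2.
    if n == 1:
        return w1
    return 2 * w(n // 2, w1) + n * n

def closed_form(n, w1=1):
    # (2 - 2/n)*n^2 + n*W(1), from summing the levels of the recursion tree.
    return (2 - 2 / n) * n ** 2 + n * w1

# The two agree exactly for powers of 2, and both are clearly Theta(n^2).
for k in range(1, 11):
    n = 2 ** k
    assert w(n) == closed_form(n)
```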

image_2021-10-14-14-29-16

It's also okay to let \( n \to \infty \) and get an approximation:

image_2021-10-14-14-32-56

However, don’t make the approximation too loose.

image_2021-10-14-14-38-56

\[\begin{aligned} \sum_{k=0}^{\log_4 n - 1} 3^k c \left( \frac{n}{4^k} \right)^2 + n^{\log_4 3} T(1) &= cn^2 \sum_{k=0}^{\log_4 n - 1} \left( \frac{3}{16} \right)^k + \Theta(n^{\log_4 3}) \end{aligned}\]
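
One standard way to finish this bound (presumably what the pictured steps do) is to over-approximate the finite geometric sum with the infinite series, which converges because the ratio \( \frac{3}{16} < 1 \):

\[ cn^2 \sum_{k=0}^{\log_4 n - 1} \left( \frac{3}{16} \right)^k + \Theta(n^{\log_4 3}) \leq cn^2 \sum_{k=0}^{\infty} \left( \frac{3}{16} \right)^k + \Theta(n^{\log_4 3}) = \frac{16}{13} cn^2 + \Theta(n^{\log_4 3}) = \Theta(n^2) \]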

image_2021-10-14-14-49-22 image_2021-10-14-14-50-56 image_2021-10-14-14-53-54 image_2021-10-14-14-55-57 image_2021-10-14-14-57-05

This gives an overall complexity of \( \Theta(n^2) \) .

Let's solve this same problem using the substitution method.

image_2021-10-14-15-06-12

Once this is set up, we can solve for \( d \) .

image_2021-10-14-15-08-20
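
A sketch of how the inductive step typically goes here (assuming the guess is \( T(n) \leq dn^2 \) for the same recurrence \( T(n) = 3T(n/4) + cn^2 \)):

\[ T(n) \leq 3d\left(\frac{n}{4}\right)^2 + cn^2 = \frac{3}{16}dn^2 + cn^2 \leq dn^2 \quad \text{whenever } d \geq \tfrac{16}{13}c \]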

Now, let's try an unbalanced tree:

image_2021-10-14-15-16-04

When we have an unbalanced tree, we need to find the depth of the root-to-leaf path that reaches the base case the fastest, and the depth of the root-to-leaf path that reaches the base case the slowest. When we start to sum each level, we can see that each level costs \( n \) until the tree becomes unbalanced (the shallowest leaves drop out). From that point on, the cost of a level is \( \leq n \) .
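
Concretely (assuming the recurrence drawn is \( T(n) = T(n/3) + T(2n/3) + cn \), which is consistent with the \( n^{\log_{3/2} 2} \) leaf-count term used below), the shallowest leaf is reached by always following the \( n/3 \) branch and the deepest leaf by always following the \( 2n/3 \) branch:

\[ \text{shortest root-to-leaf path: } \log_3 n \text{ levels}, \qquad \text{longest root-to-leaf path: } \log_{3/2} n \text{ levels} \]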

Note: If we assume the cost of every level is in fact \( n \) , then we will find an upper bound on the complexity. If we instead assume that the tree ends at the shortest root-to-leaf path, then we will get a lower bound.

image_2021-10-14-15-22-25

If we find the lower bound and upper bound, and they are the same, then we can say we found the exact bound. So, the lower bound is

image_2021-10-14-15-28-05 image_2021-10-14-15-29-04
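
Spelled out (still assuming the recurrence \( T(n) = T(n/3) + T(2n/3) + cn \)): each of the first \( \log_3 n \) levels is complete and costs \( cn \), so

\[ T(n) \geq cn \log_3 n = \frac{c \, n \lg n}{\lg 3} = \Omega(n \lg n) \]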

Now that we have found a lower bound, let's find the upper bound:

image_2021-10-14-15-35-09

Note: In the term \( n^{\log_\frac{3}{2} 2} \) , since the base of the log is less than its argument, the exponent \( \log_{3/2} 2 \) is greater than 1, so this polynomial term dominates the linearithmic term.

image_2021-10-14-15-37-11

Since our lower and upper bounds do not match, we cannot yet say we have an exact bound (big theta).

We can get a tighter upper bound by not overestimating as much. If we assume that each level costs \( n \) all the way down, but do NOT include the leaf node costs, this gives our upper bound as \( O(n \lg n) \) , which matches our lower bound. That means we have an overall complexity of \( \Theta(n \lg n) \) .
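
The base conversion behind that last step, since \( \log_{3/2} n \) and \( \lg n \) differ only by a constant factor:

\[ n \log_{3/2} n = \frac{n \lg n}{\lg \frac{3}{2}} = \Theta(n \lg n) \]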

image_2021-10-14-15-44-48 image_2021-10-14-15-45-12

Quicksort recurrence #

image_2021-10-14-15-45-35 image_2021-10-14-15-46-32 image_2021-10-14-15-47-41 image_2021-10-14-15-51-32 image_2021-10-14-15-51-46 image_2021-10-14-15-54-19
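
For reference, the standard quicksort recurrences (a sketch, not necessarily exactly what appears on the boards above): in the worst case the pivot splits off a single element, and in the balanced case it splits the array in half:

\[\begin{aligned} \text{worst case: } & T(n) = T(n-1) + \Theta(n), \text{ which solves to } \Theta(n^2) \\ \text{balanced split: } & T(n) = 2T(n/2) + \Theta(n), \text{ which solves to } \Theta(n \lg n) \end{aligned}\]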

Master’s method #

Master's method can be used when a recurrence is in this exact form:

image_2021-10-14-16-01-42

Generally, we’ll have 3 cases:

image_2021-10-14-16-03-53
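
For reference, the standard statement of the three cases for \( T(n) = aT(n/b) + f(n) \), with constants \( a \geq 1 \) and \( b > 1 \):

\[\begin{aligned} 1.\;\; & f(n) = O(n^{\log_b a - \epsilon}) \text{ for some } \epsilon > 0 \Rightarrow T(n) = \Theta(n^{\log_b a}) \\ 2.\;\; & f(n) = \Theta(n^{\log_b a}) \Rightarrow T(n) = \Theta(n^{\log_b a} \lg n) \\ 3.\;\; & f(n) = \Omega(n^{\log_b a + \epsilon}) \text{ for some } \epsilon > 0 \text{ and } a f(n/b) \leq c f(n) \text{ for some } c < 1 \Rightarrow T(n) = \Theta(f(n)) \end{aligned}\]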

Let's visualize the first case with a recursion tree:

image_2021-10-14-16-11-50 image_2021-10-14-16-14-58
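
The intuition the pictures presumably capture: in case 1 the per-level cost \( a^k f(n/b^k) \) grows geometrically with depth, so the leaf level dominates. A sketch, using \( f(n) = O(n^{\log_b a - \epsilon}) \):

\[ a^k f\left(\frac{n}{b^k}\right) = O\left(a^k \left(\frac{n}{b^k}\right)^{\log_b a - \epsilon}\right) = O\left(n^{\log_b a - \epsilon} \, b^{\epsilon k}\right) \quad \Rightarrow \quad \sum_{k=0}^{\log_b n - 1} a^k f\left(\frac{n}{b^k}\right) = O\left(n^{\log_b a - \epsilon} \cdot n^{\epsilon}\right) = O(n^{\log_b a}) \]

Since the leaves alone already cost \( \Theta(n^{\log_b a}) \), the total is \( \Theta(n^{\log_b a}) \).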

Mergesort recurrence using Master’s method #

So first we identify our variables \( a, b, f(n) \) and go through the cases.
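
For mergesort, the recurrence is \( T(n) = 2T(n/2) + \Theta(n) \), so:

\[ a = 2, \qquad b = 2, \qquad f(n) = \Theta(n), \qquad n^{\log_b a} = n^{\log_2 2} = n \]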

image_2021-10-14-16-17-24

The first case fails because \( f(n) = \Theta(n) \) is not \( O(n^{1-\epsilon}) \) for any \( \epsilon > 0 \) . So we check the second case, which holds because \( f(n) = \Theta(n) = \Theta(n^{\log_2 2}) \) .

image_2021-10-14-16-19-21

So the Master's method gives us a runtime complexity of \( \Theta(n \lg n) \) , which is the runtime of mergesort (its best, average, and worst cases are all \( \Theta(n \lg n) \) ).

Some more recurrence examples using Master’s method #

image_2021-10-14-16-20-21 image_2021-10-14-16-22-42

The third case works for this example; to apply it, you need to find an \( \epsilon > 0 \) that makes the bound hold (usually a value between 0 and 1 works).
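
As a generic illustration of checking case 3 (not necessarily the recurrence pictured), consider \( T(n) = 2T(n/2) + n^2 \), where \( n^{\log_2 2} = n \):

\[ n^2 = \Omega(n^{1+\epsilon}) \text{ for any } 0 < \epsilon \leq 1, \qquad 2\left(\frac{n}{2}\right)^2 = \frac{1}{2}n^2 \leq c n^2 \text{ with } c = \tfrac{1}{2} < 1, \qquad \text{so } T(n) = \Theta(n^2) \]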

image_2021-10-14-16-24-15 image_2021-10-14-16-26-00 image_2021-10-14-16-30-38 image_2021-10-14-16-33-41

In this example, all 3 cases fail, so another method of solving recurrences is needed besides Master's method.
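
A classic example of this situation (not necessarily the one pictured) is \( T(n) = 2T(n/2) + n \lg n \): here \( n^{\log_2 2} = n \), and \( n \lg n \) grows faster than \( n \) but not polynomially faster, so no case applies:

\[ \frac{n \lg n}{n^{1+\epsilon}} = \frac{\lg n}{n^{\epsilon}} \to 0 \text{ for every } \epsilon > 0, \text{ so } n \lg n \neq \Omega(n^{1+\epsilon}) \text{ and case 3 fails (cases 1 and 2 fail as well)} \]

A recursion tree still handles it: level \( k \) costs \( n(\lg n - k) \), which sums to \( \Theta(n \lg^2 n) \).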