
Analysis of Algorithms II

The Recursive Case


Except as otherwise noted, the content of this presentation is licensed under the Creative Commons Attribution 2.5 License.

Key Topics
* Recurrence Relations
* Solving Recurrence Relations
* The Towers of Hanoi
* Analyzing Recursive Subprograms

Recurrence Relations
Recall that a recursive or inductive definition has the following parts:
1. Base Case: the initial condition or basis, which defines the first (or first few) elements of the sequence.
2. Inductive (Recursive) Case: an inductive step in which later terms in the sequence are defined in terms of earlier terms.

A recurrence relation for a sequence a1, a2, a3, ... is a formula that relates each term ak to certain of its predecessors ak-1, ak-2, ..., ak-i, where i is a fixed integer and k is any integer greater than or equal to i. The initial conditions for such a recurrence relation specify the values of a1, a2, a3, ..., ai-1.

Example 1
Problem: How many bit strings of length n do not contain the pattern 11?

a) One way to solve this problem is enumeration:
length 0: empty string (1 string without 11)
length 1: 0, 1 (2 strings without 11)
length 2: 00, 01, 10, 11 (3 strings without 11)
length 3: 000, 001, 010, 011, 100, 101, 110, 111 (5 strings without 11)

b) Suppose the number of bit strings of length less than some integer k that do not contain the 11 pattern is known (it just so happens that we do know it for 0, 1, 2, 3). To find the number of strings of length k that do not contain the bit pattern 11, we can try to express strings that have a certain property in terms of shorter strings with the same property. This is called recursive thinking.

Set of all bit strings of length k without 11
  = (bit strings of the form 0 followed by k-1 bits without 11)
  + (bit strings of the form 10 followed by k-2 bits without 11)

A Recurrence Relation:
1) s0 = 1, s1 = 2
2) sk = sk-1 + sk-2
If we calculate this recurrence up to s10, we get 144.
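As a supplementary sketch (not part of the original slides), the recurrence can be computed directly; the function name countNo11 and the iterative style are our own choices, assuming n >= 0:

#include <stdio.h>

/* Number of bit strings of length n that do not contain the pattern 11,
   computed iteratively from the recurrence s0 = 1, s1 = 2, sk = sk-1 + sk-2. */
long countNo11(int n) {
    if (n == 0) return 1;
    if (n == 1) return 2;
    long prev2 = 1, prev1 = 2, cur = 0;   /* s0 and s1 */
    for (int k = 2; k <= n; k++) {
        cur = prev1 + prev2;              /* sk = sk-1 + sk-2 */
        prev2 = prev1;
        prev1 = cur;
    }
    return cur;
}

int main(void) {
    printf("%ld\n", countNo11(10));       /* prints 144, matching the text */
    return 0;
}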

Example 2
Consider the following recurrence relation:
a0 = 1
ak = ak-1 + 2

Problem: What is the closed form formula for this recurrence relation?

We start by substituting k-1 for k in the recursive definition above to obtain: ak-1 = ak-2 + 2. What we are looking for is a series of representations of ak in terms of earlier elements of the sequence:
ak = ak-1 + 2 = (ak-2 + 2) + 2 = ak-2 + 4

Next we substitute k-2 for k in the original equation to get ak-2 = ak-3 + 2, and then substitute this value for ak-2 in the previous expression:
ak = ak-2 + 4 = (ak-3 + 2) + 4 = ak-3 + 6
ak = ak-3 + 6 = (ak-4 + 2) + 6 = ak-4 + 8
ak = ak-4 + 8 ...
In general, ak = ak-i + 2i; setting i = k gives ak = a0 + 2k = 1 + 2k. We now prove this formula by induction.

Proof: Let P(n) denote the following: if a0, a1, a2, ..., an is the sequence defined by:

a0 = 1
ak = ak-1 + 2

then an = 1 + 2n for all n >= 1.


Base Case: prove that P(1) is true. Using the definition of the recurrence relation, we have: a1 = a0 + 2 = 1 + 2 = 3. Now, we show that we get the same answer using the formula an = 1 + 2n for n = 1: a1 = 1 + 2(1) = 3. Therefore, the base case holds.

Inductive Case: The induction hypothesis is to assume P(k): ak = 1 + 2k. We then need to show P(k+1): ak+1 = 1 + 2(k + 1).
ak+1 = a(k+1)-1 + 2      from the recurrence relation (this is a "given")
ak+1 = ak + 2            algebra
ak+1 = 1 + 2k + 2        substitution from inductive hypothesis (ak = 1 + 2k)
ak+1 = 1 + 2(k + 1)      algebra
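As a small sanity check (our own addition, not from the slides), the closed form can be compared against the recurrence directly:

#include <assert.h>

int main(void) {
    int a = 1;                    /* a0 = 1 */
    for (int k = 1; k <= 20; k++) {
        a = a + 2;                /* ak = ak-1 + 2 */
        assert(a == 1 + 2 * k);   /* closed form: ak = 1 + 2k */
    }
    return 0;
}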

Example 3

Consider the recurrence relation K1 = 0, Kn = Kn-1 + (n-1) for n >= 2. Enumerating the first few values of this sequence yields:
K2 = K1 + 1 = 0 + 1 = 1
K3 = K2 + 2 = 1 + 2 = 3
K4 = K3 + 3 = 3 + 3 = 6
K5 = K4 + 4 = 6 + 4 = 10
From the recurrence relation, we can see that this sequence is given by the formula (for all n >= 1):
Kn = 0 + 1 + 2 + ... + (n-1) = sum of i for i = 0 to n-1 = n(n-1)/2

Example 4
Tower of Hanoi

The object of the Towers of Hanoi is to move all the disks from the first peg to the third peg while always following these rules:
1. Only one disk may be moved at a time (specifically, the top disk on any peg).
2. At no time can a larger disk be placed on a smaller one.

To move n disks from the starting peg to the ending peg: the top n-1 disks are first moved to the auxiliary peg, requiring Hn-1 moves; the largest disk is then moved to the ending peg, requiring 1 move; finally, the n-1 disks on the auxiliary peg are moved to the ending peg, requiring an additional Hn-1 moves. This gives us the recurrence relation:

Hn = 2Hn-1 + 1
The base case for this relation is H0 = 0, since if there are no disks, no moves are required to have all the disks on the third peg. To determine a closed form formula, in which no Hn appears on the right side of the equation, we will try repeated substitution:

Hn = 2Hn-1 + 1
   = 2(2Hn-2 + 1) + 1 = 2^2 Hn-2 + 2 + 1
   = 2^2 (2Hn-3 + 1) + 2 + 1 = 2^3 Hn-3 + 2^2 + 2 + 1

The pattern to this recurrence relation is:

Hn = 2^i Hn-i + 2^(i-1) + 2^(i-2) + ... + 2^2 + 2^1 + 2^0

We can set i = n (this will require an inductive proof):

Hn = 2^n H0 + 2^(n-1) + 2^(n-2) + ... + 2^2 + 2^1 + 1

Substituting H0 = 0 yields the formula:

Hn = 2^(n-1) + 2^(n-2) + ... + 2^2 + 2^1 + 1 = 2^n - 1
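As an illustrative sketch (our own addition), the recurrence and the closed form can be checked against each other with a short program; the function name hanoiMoves is an assumption:

#include <assert.h>

/* Number of moves required for n disks, taken straight from the recurrence
   H0 = 0, Hn = 2*Hn-1 + 1. */
unsigned long hanoiMoves(int n) {
    if (n == 0) return 0;
    return 2 * hanoiMoves(n - 1) + 1;
}

int main(void) {
    for (int n = 0; n <= 20; n++) {
        /* closed form: Hn = 2^n - 1 */
        assert(hanoiMoves(n) == (1UL << n) - 1);
    }
    return 0;
}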

Example 5
Find a closed form solution for the following recurrence relations. For brevity, we omit the inductive proofs.

a) Recurrence relation: a0 = 2, an = 3an-1

Solution:
an = 3an-1
an = 3(3an-2) = 3^2 an-2
an = 3^2 (3an-3) = 3^3 an-3
an = 3^(i-1) (3an-i) = 3^i an-i
Set i = n, yielding: an = 3^n a0 = 2 * 3^n

b) Recurrence relation: a0 = 1, an = 2an-1 - 1

Solution:
an = 2an-1 - 1
an = 2(2an-2 - 1) - 1 = 2^2 an-2 - 2 - 1
an = 2^2 (2an-3 - 1) - 2 - 1 = 2^3 an-3 - 2^2 - 2 - 1
an = 2^i an-i - 2^(i-1) - 2^(i-2) - ... - 2^0
Set i = n, yielding:
an = 2^n a0 - 2^(n-1) - 2^(n-2) - ... - 2^0 = 2^n - (2^(n-1) + 2^(n-2) + ... + 2^0)
an = 2^n - (2^n - 1)     (Note: 2^(n-1) + 2^(n-2) + ... + 1 = 2^n - 1)
an = 1

c) Recurrence relation: a0 = 5, an = n*an-1

Solution:
an = n(n-1)an-2
an = n(n-1)(n-2)an-3
an = n(n-1)(n-2)...(n-i+1)an-i
Set i = n, yielding:
an = n(n-1)(n-2)...(1) a0 = n! a0
an = 5 * n!
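A small sketch (our own addition) that checks the three closed forms above against their recurrences for the first several values of n:

#include <assert.h>

int main(void) {
    /* (a) a0 = 2, an = 3*an-1; closed form an = 2 * 3^n */
    long a = 2, pow3 = 1;
    for (int n = 1; n <= 10; n++) {
        a = 3 * a;
        pow3 *= 3;
        assert(a == 2 * pow3);
    }

    /* (b) a0 = 1, an = 2*an-1 - 1; closed form an = 1 */
    long b = 1;
    for (int n = 1; n <= 10; n++) {
        b = 2 * b - 1;
        assert(b == 1);
    }

    /* (c) a0 = 5, an = n*an-1; closed form an = 5 * n! */
    long c = 5, fact = 1;
    for (int n = 1; n <= 10; n++) {
        c = n * c;
        fact *= n;
        assert(c == 5 * fact);
    }
    return 0;
}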

A recurrence relation expresses the running time of a recursive algorithm. This includes how many recursive calls are generated at each level of the recursion, how much of the problem is solved by each recursive call, and how much work is done at each level.

Example 6
Consider the following function to compute factorials:

int factorial(int n) {
    if (n <= 1)
        return 1;
    else
        return n * factorial(n - 1);
}

Recurrence relation:
base case: T(1) = a
induction: T(n) = b + T(n-1), for n > 1

Closed form formula:
T(1) = a
T(2) = b + T(1) = b + a
T(3) = b + T(2) = b + (b + a) = a + 2b
T(4) = b + T(3) = b + (a + 2b) = a + 3b
...
T(n) = a + (n-1)b for all n >= 1

Proof:
Let P(n) denote that the running time of factorial is T(n) = a + (n-1)b. Recall that the recurrence relation definition gives us: T(1) = a, and T(n) = b + T(n-1) for n > 1.

Base case: prove that P(1) is true.
T(1) = a + ((1-1) * b) = a + (0 * b) = a. This holds since the base case of our inductive definition states that T(1) = a.

Inductive case: the inductive hypothesis is to assume P(k): T(k) = a + (k-1)b, and show P(k+1): T(k+1) = a + kb. We know from the recurrence relation definition that T(k+1) = b + T(k). We use the inductive hypothesis to substitute for T(k). This gives us:
T(k+1) = b + T(k) = b + a + (k-1)b = b + a + kb - b = a + kb
P(k+1) holds when P(k) holds, and P(1) holds; therefore P(n) is true for all n >= 1.
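As an illustrative sketch (our own addition), the linear growth described by T(n) = a + (n-1)b can be observed by counting the recursive calls; the counter and the name countedFactorial are assumptions made for this example:

#include <stdio.h>

static int calls = 0;   /* counts invocations of countedFactorial */

int countedFactorial(int n) {
    calls++;
    if (n <= 1)
        return 1;
    else
        return n * countedFactorial(n - 1);
}

int main(void) {
    countedFactorial(10);
    /* With T(n) = a + (n-1)b, the work grows linearly in n;
       here the call count is exactly n = 10. */
    printf("calls = %d\n", calls);
    return 0;
}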

Example 7
Consider the following recursive selection sort algorithm:

void SelectionSort(int A[], int i, int n) {
    int j, small, temp;
    if (i < n) {
        small = i;
        for (j = i + 1; j <= n; j++) {
            if (A[j] < A[small])
                small = j;
        }
        temp = A[small];
        A[small] = A[i];
        A[i] = temp;
        SelectionSort(A, i + 1, n);
    }
}

The inductive definition of recursive SelectionSort is:
T(1) = a
T(m) = T(m-1) + O(m)
To solve this recurrence relation, we first get rid of the big-Oh expression by using the definition of big-Oh (f(n) = O(g(n)) if f(n) <= C * g(n)), so we can substitute C*m for O(m):
T(1) = a
T(m) = T(m-1) + C*m
Now, we can either try repeated substitution or just enumerate a few cases to look for a pattern. Let's try repeated substitution:
T(m) = T(m-1) + C*m
     = T(m-2) + 2Cm - C          because T(m-1) = T(m-2) + C(m-1)
     = T(m-3) + 3Cm - 3C         because T(m-2) = T(m-3) + C(m-2)
     = T(m-4) + 4Cm - 6C         because T(m-3) = T(m-4) + C(m-3)
     = T(m-5) + 5Cm - 10C        because T(m-4) = T(m-5) + C(m-4)
     ...
     = T(m-j) + jCm - (j(j-1)/2)C

To get a closed form formula we let j = m-1:
T(m) = T(1) + (m-1)Cm - ((m-1)(m-2)/2)C
     = a + m^2 C - Cm - (m^2 C - 3Cm + 2C)/2
     = a + (2m^2 C - 2Cm - m^2 C + 3Cm - 2C)/2
     = a + (m^2 C + Cm - 2C)/2

Closed form formula: T(m) = a + (m^2 C + Cm - 2C)/2
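A brief sketch (our own addition) that counts the element comparisons made by the recursive SelectionSort above; the comparison counter is an assumption added for illustration. For an array indexed 0..n the count is n(n+1)/2, consistent with the quadratic closed form:

#include <assert.h>

static long comparisons = 0;   /* counts A[j] < A[small] tests */

void SelectionSortCounted(int A[], int i, int n) {
    int j, small, temp;
    if (i < n) {
        small = i;
        for (j = i + 1; j <= n; j++) {
            comparisons++;
            if (A[j] < A[small])
                small = j;
        }
        temp = A[small];
        A[small] = A[i];
        A[i] = temp;
        SelectionSortCounted(A, i + 1, n);
    }
}

int main(void) {
    int A[8] = {5, 2, 7, 1, 8, 3, 6, 4};
    SelectionSortCounted(A, 0, 7);          /* sorts indices 0..7 */
    assert(comparisons == 7 * 8 / 2);       /* n(n+1)/2 with n = 7 */
    for (int k = 0; k < 8; k++)
        assert(A[k] == k + 1);              /* sorted: 1..8 */
    return 0;
}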

Example 8
MergeSort(L)

1)  if (length of L > 1) {
2)      Split list into first half and second half
3)      MergeSort(first half)
4)      MergeSort(second half)
5)      Merge first half and second half into sorted list
    }

To analyze the complexity of MergeSort, we need to define a recurrence relation. The base case is when we have a list of 1 element and only line 1 of the function is executed. Thus, the base case is constant time, O(1). If the test of line 1 fails, we must execute lines 2-5. The time spent in this function when the length of the list > 1 is the sum of the following:

1) O(1) for the test on line 1
2) O(n) for the split function call on line 2
3) T(n/2) for the recursive call on line 3
4) T(n/2) for the recursive call on line 4
5) O(n) for the merge function call on line 5
If we drop the O(1)'s and apply the summation rule, we get 2T(n/2) + O(n). If we substitute constants in place of the big-Oh notation we obtain:
T(1) = a
T(n) = 2T(n/2) + bn
To solve this recurrence relation, we will enumerate a few values. We will stick to values of n that are powers of two so things divide evenly:
T(2)  = 2T(1) + 2b  = 2a + 2b
T(4)  = 2T(2) + 4b  = 2(2a + 2b) + 4b   = 4a + 8b
T(8)  = 2T(4) + 8b  = 2(4a + 8b) + 8b   = 8a + 24b
T(16) = 2T(8) + 16b = 2(8a + 24b) + 16b = 16a + 64b

There is obviously a pattern, but it is not as easy to represent as the others have been. Note the following relationships:

value of n:         2    4    8    16
coefficient of b:   2    8    24   64
ratio:              1    2    3    4

So, it appears that the coefficient of b is n times another factor that grows by 1 each time n doubles. The ratio is log2 n because log2 2 = 1, log2 4 = 2, log2 8 = 3, etc. Our "guess" for the solution of this recurrence relation is T(n) = an + bn log2 n. We could have used repeated substitution, in which case we would have the following formula:
T(n) = 2^i T(n/2^i) + ibn
Now, if we let i = log2 n, we end up with:
n*T(1) + bn log2 n = an + bn log2 n     (because 2^(log2 n) = n)
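For concreteness, here is a minimal C sketch of the MergeSort outline above (our own addition; it sorts a plain int array rather than a list, and the helper names merge and mergeSortRange are assumptions):

#include <string.h>

/* Merge the sorted halves A[lo..mid] and A[mid+1..hi] using scratch space tmp. */
static void merge(int A[], int tmp[], int lo, int mid, int hi) {
    int i = lo, j = mid + 1, k = lo;
    while (i <= mid && j <= hi)
        tmp[k++] = (A[i] <= A[j]) ? A[i++] : A[j++];
    while (i <= mid) tmp[k++] = A[i++];
    while (j <= hi)  tmp[k++] = A[j++];
    memcpy(&A[lo], &tmp[lo], (size_t)(hi - lo + 1) * sizeof(int));
}

/* 1) if the range holds more than one element:
   2)   split it into two halves,
   3-4) sort each half recursively,
   5)   merge the sorted halves. */
static void mergeSortRange(int A[], int tmp[], int lo, int hi) {
    if (lo < hi) {
        int mid = lo + (hi - lo) / 2;
        mergeSortRange(A, tmp, lo, mid);
        mergeSortRange(A, tmp, mid + 1, hi);
        merge(A, tmp, lo, mid, hi);
    }
}

Splitting and merging both touch every element in the range, which is the O(n) term in T(n) = 2T(n/2) + bn; the two recursive calls contribute the 2T(n/2) term.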

Theorem 1 (Master Theorem):

Let f be an increasing function that satisfies the recurrence relation

f(n) = a f(n/b) + c n^d

whenever n = b^k, where k is a positive integer, a >= 1, b is an integer greater than 1, c is a positive real number, and d is a non-negative real number. Then:

f(n) = O(n^d)            if a < b^d
       O(n^d log n)      if a = b^d
       O(n^(log_b a))    if a > b^d

Example 9
Problem: Use Theorem 1 above to find the big-Oh running time of MergeSort (from Example 8).

Solution: In Example 8, we derived the recurrence relation for MergeSort:
T(n) = 2T(n/2) + xn
where we have simply replaced the positive constant b in the original recurrence relation with the constant x, so that we do not confuse variable names below. Applying Theorem 1, we choose these constants: a = 2, b = 2, c = x, and d = 1, and the recurrence becomes:
f(n) = 2 f(n/2) + x n^1

Consider the three cases of the Master Theorem, which are based on the relationship between a and b^d. Here are some examples to consider:

1. a < b^d: f(n) = 2 f(n/2) + x n^2. In this example, 2 < 2^2, and thus f(n) = O(n^2). The x n^2 term is growing faster than the 2 f(n/2) term and thus dominates.
2. a = b^d: f(n) = 2 f(n/2) + x n^1. In this example, 2 = 2^1, and thus f(n) = O(n log n). The two terms grow together.
3. a > b^d: f(n) = 8 f(n/2) + x n^2. In this example, 8 > 2^2, and thus f(n) = O(n^(log2 8)) = O(n^3). The 8 f(n/2) term is growing faster than the x n^2 term and thus dominates.
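As a small illustrative helper (our own addition, not from the slides), the case analysis above can be mechanized; the function name masterBound and its printed output format are assumptions:

#include <stdio.h>
#include <math.h>

/* Print the big-Oh bound given by Theorem 1 for f(n) = a f(n/b) + c n^d. */
void masterBound(double a, double b, double d) {
    double bd = pow(b, d);
    if (a < bd)
        printf("a=%.0f, b=%.0f, d=%.1f: O(n^%.2f)\n", a, b, d, d);
    else if (a == bd)
        printf("a=%.0f, b=%.0f, d=%.1f: O(n^%.2f log n)\n", a, b, d, d);
    else
        printf("a=%.0f, b=%.0f, d=%.1f: O(n^%.2f)\n", a, b, d, log(a) / log(b));
}

int main(void) {
    masterBound(2, 2, 2);   /* case a < b^d: O(n^2) */
    masterBound(2, 2, 1);   /* case a = b^d: O(n log n), i.e. MergeSort */
    masterBound(8, 2, 2);   /* case a > b^d: O(n^3) */
    return 0;
}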

Example 10
Problem: Use repeated substitution to find the time complexity of the function recurse. Verify your result using induction.

/* Assume only non-negative even values of n are passed in */
int recurse(int n) {
    int i, total = 0;
1)  if (n == 0) return 1;
2)  for (i = 0; i < 4; i++) {
3)      total += recurse(n - 2);
    }
4)  return total;
}

Solution: When n = 0 (the base case), this function only executes line 1, doing O(1) work. We will denote this constant amount of work by a. When n >= 2 (the recursive case), this function does O(1) work in executing lines 1, 2, and 4, which we will denote by b, and also makes four recursive calls with parameter values n-2 in the loop body (line 3).

We use these two points to perform the recurrence relation analysis below:

From the base case, we have: T0 = a
From the recursive case, we have: Tn = 4Tn-2 + b

We apply repeated substitution:


Tn = 4Tn-2 + b
Tn = 4(4Tn-4 + b) + b = 4^2 Tn-4 + 4b + b
Tn = 4^2 (4Tn-6 + b) + 4b + b = 4^3 Tn-6 + 4^2 b + 4b + b
Tn = 4^i Tn-2i + 4^(i-1) b + 4^(i-2) b + ... + 4^0 b

Note that 4^k = 2^(2k), yielding:

Tn = 2^(2i) Tn-2i + 2^(2(i-1)) b + 2^(2(i-2)) b + ... + 2^0 b

Set i = n/2, yielding:

Tn = 2^n T0 + 2^(n-2) b + 2^(n-4) b + ... + 2^0 b = 2^n a + (2^(n-2) + 2^(n-4) + ... + 2^0) b

We need to compute the closed form for (2^(n-2) + 2^(n-4) + ... + 2^0), which is (2^n - 1)/3. Here is a brief digression on how we compute this closed form formula:
Let x = (2^(n-2) + 2^(n-4) + ... + 2^0)
Then 2^2 x = 4x = 2^2 (2^(n-2) + 2^(n-4) + ... + 2^0) = (2^n + 2^(n-2) + ... + 2^2)
So 2^2 x - x = 3x = (2^n + 2^(n-2) + ... + 2^2) - (2^(n-2) + 2^(n-4) + ... + 2^0)
3x = 2^n - 2^0 = 2^n - 1
x = (2^n - 1)/3 = (2^(n-2) + 2^(n-4) + ... + 2^0)

Therefore Tn = 2^n a + ((2^n - 1)/3) b. We now verify this formula by induction.

Base case: Show P(0). T0 = 2^0 a + ((2^0 - 1)/3) b = a + ((1 - 1)/3) b = a. This is the initial condition.

Inductive case: Assume the inductive hypothesis P(n): Tn = 2^n a + ((2^n - 1)/3) b. Show P(n+2): Tn+2 = 2^(n+2) a + ((2^(n+2) - 1)/3) b.
Tn+2 = 4Tn + b                                    by definition of the recurrence (note: we only consider even values of n)
Tn+2 = 4(2^n a + ((2^n - 1)/3) b) + b             substituting Tn with the inductive hypothesis
Tn+2 = 2^2 (2^n a + ((2^n - 1)/3) b) + b          algebra: 4 = 2^2
Tn+2 = 2^(n+2) a + ((2^(n+2) - 2^2)/3) b + b      algebra: multiply in the 2^2
Tn+2 = 2^(n+2) a + ((2^(n+2) - 3 - 1)/3) b + b    algebra: -2^2 = -3 - 1
Tn+2 = 2^(n+2) a + ((2^(n+2) - 1)/3) b - b + b    algebra: (-3/3)b = -b
Tn+2 = 2^(n+2) a + ((2^(n+2) - 1)/3) b            this is the desired result
QED.

We calculate the big-Oh complexity for the function as follows: from the closed form formula Tn = 2^n a + ((2^n - 1)/3) b, we see that the algorithm has time complexity: O(2^n a) + O(2^n b) = O(2^n) + O(2^n) = O(2^n).
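As a final illustrative sketch (our own addition), the exponential growth can be observed directly by counting calls to recurse; the call counter is an assumption added for this example:

#include <stdio.h>

static long calls = 0;   /* counts invocations of recurse */

int recurse(int n) {
    int i, total = 0;
    calls++;
    if (n == 0) return 1;
    for (i = 0; i < 4; i++) {
        total += recurse(n - 2);
    }
    return total;
}

int main(void) {
    for (int n = 0; n <= 12; n += 2) {
        calls = 0;
        recurse(n);
        /* The call count grows by roughly a factor of 4 each time n increases
           by 2, i.e. it is on the order of 4^(n/2) = 2^n, matching O(2^n). */
        printf("n = %2d: %ld calls\n", n, calls);
    }
    return 0;
}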
