6.001: Lecture 4
Orders of Growth

(define (fact n)
  (if (= n 0)
      1
      (* n (fact (- n 1)))))
• We can run this for various values of n:
  (fact 10)
  (fact 100)
  (fact 1000)
  (fact 10000)

• 2^100 ≈ 10^30
• Measuring time requirements of a function
• Asymptotic notation
• Calculating the time complexity for different functions
• Measuring space requirements of a function

• Suppose n is a parameter that measures the size of a problem
• Let t(n) be the amount of time necessary to solve a problem of size n
• What do we mean by "the amount of time"? How do we measure "time"?
– Base case: t(n) = n is true for n = 0
– Inductive case: if t(n) = n, then it follows that t(n + 1) = n + 1

• Define t(n) to be the number of additions required by (fib n)
• By looking at fib, we can see that:
  t(1) = 0
  t(2) = 0
  t(n) = 1 + t(n − 1) + t(n − 2) for n ≥ 3
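The recurrence above can be cross-checked mechanically. The sketch below (in Python rather than the lecture's Scheme, purely for illustration) counts the additions a naive Fibonacci actually performs and compares the count against t(n):

```python
def fib_counting(n, counter):
    # Naive Fibonacci with fib(1) = fib(2) = 1; counter[0] tallies additions.
    if n <= 2:
        return 1  # base cases perform no additions
    counter[0] += 1  # the one addition combining the two recursive results
    return fib_counting(n - 1, counter) + fib_counting(n - 2, counter)

def t(n):
    # The recurrence from the text: t(1) = t(2) = 0, t(n) = 1 + t(n-1) + t(n-2)
    return 0 if n <= 2 else 1 + t(n - 1) + t(n - 2)

for n in range(1, 16):
    counter = [0]
    fib_counting(n, counter)
    assert counter[0] == t(n)  # measured additions match the recurrence
```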
We say t(n) has order of growth Θ(f(n)) if there are constants k, k1, and k2 such that for all n ≥ k, we have k1 f(n) ≤ t(n) ≤ k2 f(n).

For example, t(n) = n has order of growth Θ(n), because
  k1 × n ≤ t(n) ≤ k2 × n
for all n ≥ k if we pick k = k1 = k2 = 1.
Examples

• t(n) = 3n^2 has order of growth Θ(n^2), because
  k1 × n^2 ≤ t(n) ≤ k2 × n^2
  for all n ≥ k if we pick k = 1 and k1 = k2 = 3

• t(n) = 3n^2 + 5n + 3 has order of growth Θ(n^2), because
  k1 × n^2 ≤ t(n) ≤ k2 × n^2
  for all n ≥ k if we pick k = 5, k1 = 3, and k2 = 8

Motivation

• In many cases, calculating the precise expression for t(n) is laborious, e.g.,
  t(n) = 5n^3 + 6n^2 + 8n + 7 or t(n) = 4n^3 + 18n^2 + 14
• In both of these cases, t(n) has order of growth Θ(n^3)
  – In many cases, it's much easier to show that t(n) has a particular order of growth (e.g., Θ(n^3)) than to calculate a precise expression for t(n)
  – Usually, the order of growth is what we really care about: the most important thing about the above functions is that they're both cubic (i.e., have order of growth Θ(n^3))
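The constants above can be spot-checked numerically. The following Python sketch (illustrative only, not part of the lecture, and a finite check rather than a proof) confirms the bounds for t(n) = 3n^2 + 5n + 3 with k = 5, k1 = 3, k2 = 8:

```python
def t(n):
    return 3 * n**2 + 5 * n + 3

# Check 3n^2 <= t(n) <= 8n^2 for n = 5..999. The algebra extends this to all
# n >= 5: the lower bound is immediate, and 5n + 3 <= 5n^2 once n >= 2.
for n in range(5, 1000):
    assert 3 * n**2 <= t(n) <= 8 * n**2
```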
• For (fact n), the running time has the form t(n) = c1 + n × c2

• By looking at fib, we can see that t(n) satisfies:
  t(1) = 0
  t(2) = 0
  t(n) = 1 + t(n − 1) + t(n − 2) for n ≥ 3
  and for n ≥ 3 we have
  t(n) ≥ 2 t(n − 2)
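Unfolding t(n) ≥ 2 t(n − 2) gives t(n) ≥ 2^k · t(n − 2k): t at least doubles every time n grows by 2, which is why the growth is exponential. A small Python check of the bound (an illustration, not the lecture's code):

```python
def t(n):
    # t(1) = t(2) = 0, t(n) = 1 + t(n-1) + t(n-2) for n >= 3
    return 0 if n <= 2 else 1 + t(n - 1) + t(n - 2)

# t is nondecreasing, so t(n-1) >= t(n-2), and hence
# t(n) = 1 + t(n-1) + t(n-2) >= 2 * t(n-2) for all n >= 3.
for n in range(3, 22):
    assert t(n) >= 2 * t(n - 2)
```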
• Intuition? Every time we add β to the problem size n, the amount of computation required is multiplied by a factor of α that is greater than 1

• Computing (fib 6) involves computing (fib 5) and (fib 4). The computation of (fib 5) then involves computing (fib 4) and (fib 3). At this point, (fib 4) has been computed twice. Isn't this wasteful?!
Why is Our Version of fib so Inefficient?

• A computation tree: we'll use a diagram in which a node labeled 5 has children labeled 4 and 3 to signify that computing (fib 5) involves recursive calls to (fib 4) and (fib 3)

The Computation Tree for (fib 7)

[Computation tree for (fib 7): the root 7 branches to 6 and 5; every node n ≥ 3 branches to n − 1 and n − 2, and the leaves are labeled 2 and 1]

• There's a lot of repeated computation here: e.g., (fib 3) is recomputed 5 times
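The repeated work is easy to measure. This Python sketch (an illustration, not the lecture's code) counts how many times each argument is evaluated during a naive computation of fib(7):

```python
from collections import Counter

calls = Counter()

def fib(n):
    # Naive recursion with fib(1) = fib(2) = 1; calls[n] counts evaluations of n.
    calls[n] += 1
    return 1 if n <= 2 else fib(n - 1) + fib(n - 2)

fib(7)
assert calls[3] == 5  # (fib 3) is computed 5 times, as the tree shows
assert calls[5] == 2  # (fib 5) is computed twice
```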
An Efficient Implementation of Fibonacci

(define (fib2 n) (fib-iter 0 1 0 n))

(define (fib-iter i a b n)
  (if (= i n)
      b
      (fib-iter (+ i 1) (+ a b) a n)))

• A trace of (fib2 5):
  (fib2 5)
  (fib-iter 0 1 0 5)
  (fib-iter 1 1 1 5)
  (fib-iter 2 2 1 5)
  (fib-iter 3 3 2 5)
  (fib-iter 4 5 3 5)
  (fib-iter 5 8 5 5)
  => 5

• If you trace the function, you'll see that we avoid repeated computations. We've gone from exponential growth to linear growth!

• Recurrence (t(n) is the number of additions):
  t(0) = 0
  t(n) = 2 + t(n − 1) for n ≥ 1
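For comparison, here is the same iterative process transliterated into a Python loop (an illustrative sketch, not the lecture's code), also counting additions to confirm that t(n) = 2n:

```python
def fib2(n):
    # Mirrors fib-iter: i counts up from 0, and (a, b) holds (fib(i+1), fib(i)).
    i, a, b = 0, 1, 0
    additions = 0
    while i != n:
        i, a, b = i + 1, a + b, a  # two additions per step: (+ i 1) and (+ a b)
        additions += 2
    return b, additions

value, adds = fib2(5)
assert value == 5  # fib(5) = 5, matching the trace
assert adds == 10  # t(5) = 2 * 5 = 10 additions
```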
• So far, we've considered the order of growth of t(n) for various functions. t(n) is the time for the procedure to run when given an input of size n

• Now let's define s(n) to be the space or memory requirements of a procedure when the problem size is n. What is the order of growth of s(n)?

(define (fact n)
  (if (= n 0)
      1
      (* n (fact (- n 1)))))

• A trace of fact, showing that it leads to a recursive process, with pending operations:

  (fact 4)
  (* 4 (fact 3))
  (* 4 (* 3 (fact 2)))
  (* 4 (* 3 (* 2 (fact 1))))
  (* 4 (* 3 (* 2 (* 1 (fact 0)))))
  (* 4 (* 3 (* 2 (* 1 1))))
  (* 4 (* 3 (* 2 1)))
  ...
  24
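The pending operations in this trace can be measured as recursion depth. A Python sketch (illustrative only; the deepest tracker is a device of this example, not part of the lecture):

```python
def fact(n, depth, deepest):
    # Recursive factorial; deepest[0] records the maximum call depth reached,
    # which equals the number of pending multiplications at the deepest point.
    deepest[0] = max(deepest[0], depth)
    if n == 0:
        return 1
    return n * fact(n - 1, depth + 1, deepest)

deepest = [0]
assert fact(4, 0, deepest) == 24
assert deepest[0] == 4  # four pending (* ...) operations: for n = 4, 3, 2, 1
```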
• In general, running (fact n) leads to n pending operations
• Each pending operation takes a constant amount of memory
• In this case, s(n) has order of growth Θ(n), i.e., linear growth in space

(define (ifact n) (ifact-helper 1 1 n))

(define (ifact-helper product counter n)
  (if (> counter n)
      product
      (ifact-helper (* product counter)
                    (+ counter 1)
                    n)))
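The same process as a Python loop (an illustrative transliteration, not the lecture's code) makes the constant-space claim concrete: only product and counter are carried from one step to the next:

```python
def ifact(n):
    # Mirrors ifact-helper: the entire state is the pair (product, counter),
    # so the space used is constant regardless of n.
    product, counter = 1, 1
    while counter <= n:
        product, counter = product * counter, counter + 1
    return product

assert ifact(4) == 24  # matches the trace of (ifact 4)
assert ifact(0) == 1
```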
A Contrast: Iterative Factorial

• A trace of (ifact 4):
  (ifact 4)
  (ifact-helper 1 1 4)
  (ifact-helper 1 2 4)
  (ifact-helper 2 3 4)
  (ifact-helper 6 4 4)
  (ifact-helper 24 5 4)
  24

• (ifact n) has no pending operations, so s(n) has an order of growth that is Θ(1). Its time complexity t(n) is Θ(n)
• In contrast, (fact n) has t(n) = Θ(n) and s(n) = Θ(n), i.e., linear growth in both space and time
• In general, iterative processes often have a lower order of growth for s(n) than recursive processes

Summary

• We've described how to calculate t(n), the time complexity of a procedure, as a function of the size of its input
• We've introduced asymptotic notation for orders of growth (e.g., Θ(n), Θ(n^2))
• There is a huge difference between exponential order of growth and non-exponential growth (e.g., if your procedure has t(n) = Θ(2^n), you will not be able to run it for large values of n)
• We've given examples of functions with linear, logarithmic, and exponential growth for t(n). Main point: you should be able to work out the order of growth of t(n) for simple procedures in Scheme