Orders of Growth
• Why this emphasis on the count’s order of
growth for large input sizes?
• Values (some approximate) of several functions
important for analysis of algorithms
Order of growth: comparing the magnitudes
• The slowest-growing function is the logarithmic
function
• the exponential function 2^n and the factorial
function n! grow so fast that their values become
astronomically large even for rather small values
of n.
▫ For example: it would take about 4·10^10 years for a computer
making a trillion (10^12) operations per second to execute
2^100 operations.
▫ That is still longer than 4.5 billion (4.5·10^9) years,
the estimated age of the planet Earth.
• 2^n and n! are often referred to as
“exponential-growth functions”
Order of growth with a twofold increase
of n
• The function log2 n increases in value by just 1
▫ (because log2(2n) = log2 2 + log2 n = 1 + log2 n);
• the linear function increases twofold,
• the linearithmic function n log2 n increases slightly
more than twofold;
• the quadratic function n^2 and cubic function n^3
increase fourfold and eightfold, respectively
▫ (because (2n)^2 = 4n^2 and (2n)^3 = 8n^3);
• and n! increases much more than that (yes, even
mathematics refuses to cooperate to give a neat
answer for n!)
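The doubling behavior above can be checked numerically. The sketch below (illustrative only; the sample size n = 64 is an arbitrary choice) prints the factor by which each function grows when n is doubled:

```python
import math

# Sketch: the ratio f(2n)/f(n) for the growth functions discussed above.
# n = 64 is an arbitrary illustrative sample size.
n = 64
functions = {
    "log2 n":   lambda x: math.log2(x),
    "n":        lambda x: x,
    "n log2 n": lambda x: x * math.log2(x),
    "n^2":      lambda x: x ** 2,
    "n^3":      lambda x: x ** 3,
}

for name, f in functions.items():
    # For n^2 this prints 4.000, for n^3 it prints 8.000, matching the slide.
    print(f"{name:9s} grows by a factor of {f(2 * n) / f(n):.3f} when n doubles")
```

Note that log2 n does not double at all: its *ratio* depends on n, but its *value* always increases by exactly 1.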
Classes of Algorithmic Efficiency
based on: Levitin, Anany, The Design and Analysis of Algorithms, Addison-Wesley, 2007
Comparing order of growth using limits
Example
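The limit test can be summarized as follows (the standard formulation, as in Levitin's text):

```latex
\lim_{n \to \infty} \frac{t(n)}{g(n)} =
\begin{cases}
0 & \Rightarrow\ t(n) \text{ has a smaller order of growth than } g(n), \\
c > 0 & \Rightarrow\ t(n) \text{ has the same order of growth as } g(n), \\
\infty & \Rightarrow\ t(n) \text{ has a larger order of growth than } g(n).
\end{cases}
```

For instance, \(\lim_{n \to \infty} \frac{n(n-1)/2}{n^2} = \frac{1}{2}\), so \(n(n-1)/2 \in \Theta(n^2)\).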
Asymptotic Complexity
• Two important reasons to determine operation and
step counts
1. To compare the time complexities of two programs that
compute the same function
2. To predict the growth in run time as the instance
characteristic changes
Break-even point
• Big O
▫ Suppose we have a function of n
g(n)
that we suggest gives us an upper bound on the
worst case behavior of our algorithm’s runtime –
which we have determined to be f(n)
then…
• Big O
▫ We describe the upper bound on the growth
of our run-time function f(n) –
▫ f(n) is O(g(n))
f(n) is bounded from above by g(n) for all
sufficiently large values of n
if there exist a positive constant c and a nonnegative integer n0 such that
f(n) <= c·g(n) for all n >= n0
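A concrete (hypothetical) instance of this definition: take f(n) = 3n + 10 and g(n) = n. Then f(n) is O(n), witnessed by the constants c = 4 and n0 = 10, since 3n + 10 <= 4n exactly when n >= 10. The functions and constants here are illustrative choices, not from the slides:

```python
# Hypothetical illustration of the Big-O definition:
# f(n) = 3n + 10 is O(n) with witness constants c = 4, n0 = 10,
# because 3n + 10 <= 4n whenever n >= 10.
def f(n):
    return 3 * n + 10

def g(n):
    return n

c, n0 = 4, 10

# Spot-check the inequality f(n) <= c*g(n) over a finite range of n >= n0.
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))
```

A finite check like this is of course not a proof; the algebraic step (10 <= n implies 3n + 10 <= 4n) is what establishes the bound for all n >= n0.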
• Big O
▫ …but be careful
▫ f(n) = O(g(n)) is incorrect
▫ the proper term is
▫ f(n) is O(g(n))
▫ to be absolutely correct
▫ f(n) ∈ O(g(n))
• Big Ω
▫ Big Omega
▫ our function is bounded from below by g(n)
▫ that is,
▫ f(n) is Ω(g(n))
if there exist a positive constant c and a nonnegative integer n0
such that f(n) >= c·g(n) for all n >= n0
• Big Θ
▫ Big Theta
▫ our function is bounded from above and
below by g(n)
▫ that is,
▫ f(n) is Θ(g(n))
if there exist two positive constants c1 and c2
such that c2·g(n) <= f(n) <= c1·g(n) for all n >= n0
• or is it that
▫ T(n) is O(n^6) (true or false)
• True
• but it is considered bad form
• why? (a Big-O bound should be as tight as possible to be informative)
Theorem
• If f(n) is a polynomial of degree d, then f(n) is O(n^d)
• If t1(n) ∈ O(g1(n)) and t2(n) ∈ O(g2(n)), then
t1(n) + t2(n) ∈ O(max{g1(n), g2(n)}).
▫ The analogous assertions are true for the Ω-notation and Θ-notation.
• Implication: The algorithm’s overall efficiency will be determined by the part with a
larger order of growth, i.e., its least efficient part.
• f(n) ∈ O(f(n))
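The sum rule can be sketched numerically with hypothetical term counts (the specific functions here are illustrative choices): t1(n) = n^2/2 is O(n^2), t2(n) = 5n is O(n), so t1(n) + t2(n) is O(max{n^2, n}) = O(n^2).

```python
# Hypothetical illustration of the sum rule:
# t1(n) = n^2/2 is O(n^2), t2(n) = 5n is O(n),
# so t1(n) + t2(n) is O(max{n^2, n}) = O(n^2).
def t1(n):
    return n * n / 2

def t2(n):
    return 5 * n

# Witness constants for the combined bound: c = 1, n0 = 10
# (n^2/2 + 5n <= n^2 holds exactly when n >= 10).
assert all(t1(n) + t2(n) <= 1 * n * n for n in range(10, 10_000))
```

This is why only the fastest-growing part of an algorithm matters asymptotically: the slower part is eventually absorbed into the constant.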
Example 1: Recursive computation of n!
Size: n
Basic operation: multiplication
Recurrence relation: M(n) = M(n-1) + 1
M(0) = 0
Solving the recurrence for M(n)
M(n) = M(n-1) + 1, M(0) = 0
M(n) = M(n-1) + 1
= (M(n-2) + 1) + 1 = M(n-2) + 2
= (M(n-3) + 1) + 2 = M(n-3) + 3
…
= M(n-i) + i
= M(0) + n
= n
The method is called backward substitution.
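The closed form M(n) = n can be cross-checked by instrumenting a recursive factorial and counting the multiplications directly (a sketch; the function name is an illustrative choice):

```python
# Sketch: count the multiplications performed by a recursive n!
# computation and compare against the closed form M(n) = n
# obtained above by backward substitution.
def factorial_with_count(n):
    """Return (n!, number of multiplications performed)."""
    if n == 0:
        return 1, 0          # M(0) = 0: no multiplications for the base case
    prev, count = factorial_with_count(n - 1)
    return n * prev, count + 1   # M(n) = M(n-1) + 1

for n in range(8):
    _, m = factorial_with_count(n)
    assert m == n            # the measured count matches M(n) = n
```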
Example 2: The Tower of Hanoi Puzzle
M(n) = 2M(n-1) + 1
= 2(2M(n-2) + 1) + 1 = 2^2*M(n-2) + 2^1 + 2^0
= 2^2*(2M(n-3) + 1) + 2^1 + 2^0
= 2^3*M(n-3) + 2^2 + 2^1 + 2^0
=…
= 2^(n-1)*M(1) + 2^(n-2) + … + 2^1 + 2^0
= 2^(n-1) + 2^(n-2) + … + 2^1 + 2^0
= 2^n -1
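The same recurrence can be verified by counting the moves of the standard recursive procedure directly (a sketch; the helper name is an illustrative choice):

```python
# Sketch: the number of moves made by the standard recursive
# Tower of Hanoi procedure, compared with the closed form 2^n - 1.
def hanoi_moves(n):
    if n == 1:
        return 1
    # Move n-1 disks aside, move the largest disk, move n-1 disks back:
    # M(n) = 2*M(n-1) + 1
    return 2 * hanoi_moves(n - 1) + 1

for n in range(1, 11):
    assert hanoi_moves(n) == 2 ** n - 1
```

The exponential closed form is why the puzzle becomes hopeless quickly: 64 disks would require 2^64 - 1 moves.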
Binary Search
• Binary search is a remarkably efficient algorithm for
searching in a sorted array.
• It works by comparing a search key K with the
array’s middle element A[m].
• If they match, the algorithm stops; otherwise, the
same operation is repeated recursively for the first
half of the array if K < A[m], and for the second half
if K > A[m]:
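The description above can be sketched as an iterative implementation (equivalent to the recursive formulation, since the recursion is tail-recursive):

```python
# A minimal binary search matching the description above:
# compare the key K with the middle element and continue
# in the left half if K < A[m], in the right half if K > A[m].
def binary_search(A, K):
    """Return an index i with A[i] == K, or -1 if K is not in sorted list A."""
    lo, hi = 0, len(A) - 1
    while lo <= hi:
        m = (lo + hi) // 2
        if A[m] == K:
            return m
        elif K < A[m]:
            hi = m - 1
        else:
            lo = m + 1
    return -1
```

For example, `binary_search([1, 3, 5, 7, 9], 7)` returns index 3, while a key that is absent returns -1.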
Analysis
Example 3: Counting #bits
A(n) = A(⌊n/2⌋) + 1, A(1) = 0
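The recurrence can be transcribed directly and checked against its known solution A(n) = ⌊log2 n⌋ (a sketch; the comparison range is an arbitrary choice):

```python
import math

# Sketch of the bit-counting recurrence A(n) = A(floor(n/2)) + 1, A(1) = 0,
# whose solution is A(n) = floor(log2 n).
def A(n):
    if n == 1:
        return 0
    return A(n // 2) + 1   # integer division implements floor(n/2)

# Spot-check the closed form over a finite range.
for n in range(1, 1000):
    assert A(n) == math.floor(math.log2(n))
```

Note A(n) counts the additions, one per halving of n; the number of bits in n's binary representation is A(n) + 1.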