
Analysis of Algorithms

CSCI 3110

Previous Evaluations of Programs

Correctness: does the algorithm do what it is supposed to do?
Generality: does it solve only a specific case, or will it work for the general case too?
Robustness: will the algorithm recover following an error?
Portability
Documentation & Style
Efficiency: does it accomplish its goals with minimal usage of computer resources (time and space)?

New Way to Evaluate

Memory efficiency: which algorithm requires less memory space during run time?
Time efficiency: which algorithm requires less computational time to complete?
When comparing algorithms in terms of efficiency, we consider the amount of memory and time required as a function of the size of the input. Thus we can measure the rate of growth of memory use or time use using that function.

What is the size of the input?

Sorting a set of numbers: the number of items to be sorted
Searching for an item in a set of items: the number of items to search
Computation involving an n x n matrix: the size of the matrix
A list of strings: numStrings x stringLength

Time Complexity

Best time
The minimum amount of time required by the algorithm for any input of size n.
Seldom of interest.

Worst time (our focus)
The maximum amount of time required by the algorithm for any input of size n.

Average time
The average amount of time required by the algorithm over all inputs of size n.

Similar definitions can be given for space complexity.

Empirical vs. Theoretical Analysis

Empirical approach: measure computer clock time.
Problem: differences in computer platforms, compilers, languages, programmers, and machine instructions.

Theoretical approach: count the number of C++ statements executed during run time, or obtain an acceptable approximation.
Therefore, the focus of this type of analysis is performing frequency counts: a count of how many times a statement is executed in an algorithm.

Look at Counting Examples
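As a sketch of what such a counting example might look like (the instrumented function below is an illustration, not taken from the slides), we can add a counter to a summation loop and tally every statement execution:

```cpp
// countStatements(n) returns the frequency count of a simple
// summation loop: the initialization runs once, the loop test runs
// n + 1 times, and the increment and loop body each run n times,
// for a total of 3n + 2 statement executions.
long countStatements(long n) {
    long count = 0;
    long sum = 0;
    ++count;                          // sum = 0: executes once
    for (long i = 1; i <= n; ++i) {
        count += 2;                   // one loop test + one increment per iteration
        sum += i;
        ++count;                      // loop body: executes n times
    }
    ++count;                          // the final loop test that exits the loop
    return count;                     // 1 + 3n + 1 = 3n + 2
}
```

Note that the total, 3n + 2, has the same shape as the frequency count analyzed by the Big O definition later in these slides.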

Algorithm Efficiency

The efficiency of an algorithm is determined in terms of the order of growth of its time function.
We only compare the rate of growth (order of magnitude) of the functions: how fast does the algorithm's time grow as the size of the input, N, grows?
Normally, the growth function will approach some simpler function asymptotically.
For growth functions having different orders of magnitude, exact frequency counts and the constants become insignificant.

Algorithms are grouped according to the order of growth of their time complexity functions, which include:
O(1) < O(log N) < O(N) < O(N log N) < O(N^2) < O(N^3) < O(2^N)
Look at the graph with all these functions plotted against each other.
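To see these orders separate numerically (a quick illustration, not part of the slides), tabulate each function for doubling input sizes:

```cpp
#include <cmath>
#include <cstdio>

// growth(n, out) fills out[] with the classic growth-rate values for
// input size n, in increasing order: log2 N, N, N log2 N, N^2, N^3, 2^N.
void growth(int n, double out[6]) {
    out[0] = std::log2(n);
    out[1] = n;
    out[2] = n * std::log2(n);
    out[3] = 1.0 * n * n;
    out[4] = 1.0 * n * n * n;
    out[5] = std::pow(2.0, n);
}

// Prints one row per input size; the right-hand columns explode
// long before the left-hand ones move much.
void printGrowthTable() {
    std::printf("%6s %8s %8s %10s %10s %12s %16s\n",
                "N", "logN", "N", "NlogN", "N^2", "N^3", "2^N");
    for (int n = 2; n <= 32; n *= 2) {
        double g[6];
        growth(n, g);
        std::printf("%6d %8.1f %8.0f %10.1f %10.0f %12.0f %16.0f\n",
                    n, g[0], g[1], g[2], g[3], g[4], g[5]);
    }
}
```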

Determining the Order of Growth

How does one determine which Big O category an algorithm's growth rate function (GRF) falls into?
This is determined by using Big O notation.
Rules of Big O notation that help to simplify the analysis of an algorithm:
Ignore the low-order terms in an algorithm's growth rate function.
Ignore the multiplicative constants in the high-order term of the GRF.
When combining GRFs, O(f(N)) + O(g(N)) = O(f(N) + g(N)) (e.g., when two segments of code are analyzed separately and we want the time complexity of the two segments combined).
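The combination rule can be seen in code. The hypothetical function below (a sketch, not from the slides) runs an O(N) segment followed by an O(N^2) segment, so its total cost is O(N) + O(N^2) = O(N + N^2) = O(N^2) once the low-order term is dropped:

```cpp
#include <vector>

// Counts pairs whose sum exceeds twice the average. Segment 1 is a
// single O(n) pass; segment 2 examines all pairs in O(n^2); the
// whole function is therefore O(n) + O(n^2) = O(n^2).
long countPairsAboveAverage(const std::vector<int>& v) {
    long n = static_cast<long>(v.size());

    // Segment 1: O(n) -- one pass to compute the average.
    long sum = 0;
    for (int x : v) sum += x;
    double avg = (n > 0) ? static_cast<double>(sum) / n : 0.0;

    // Segment 2: O(n^2) -- examine every unordered pair.
    long pairs = 0;
    for (long i = 0; i < n; ++i)
        for (long j = i + 1; j < n; ++j)
            if (v[i] + v[j] > 2 * avg) ++pairs;
    return pairs;
}
```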

Examples

F(n) = 10n + 3n^3 + 12 is O(?)
F(n) = 20n log n + 3n + 2 is O(?)
F(n) = 12n log n + 2n^2 is O(?)
F(n) = 2^n + 3n is O(?)

Formal Definition of Big O

F(n) = O(g(n)) (F(n) is Big O of g(n)) if there are positive constants C and n0 such that:
f(n) <= C*g(n) for all n >= n0, where n0 is a non-negative integer.
Note: Some definitions of Big O read |f(N)| <= C*|g(N)| for all N >= N0, where N0 is a non-negative integer.
However, we will use the relaxed version of the definition, since we are dealing with frequency counts, which should be positive anyway.

Example

Show by definition that f(n) = 3n + 2 is O(n).
By definition, we must show there exist C and n0 such that 3n + 2 <= C*n for all n >= n0, where n0 is a non-negative integer.
Let C = 4. Why pick 4?
If C = 4, when is 3n + 2 <= 4n? Subtract 3n from both sides of the inequality: 3n + 2 - 3n <= 4n - 3n, or 2 <= n. Thus n0 is 2.
Is this the only C and n0 that will work?
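As a quick sanity check (not a substitute for the proof), a loop can confirm that the witnesses C = 4 and n0 = 2 make the inequality hold over any finite range:

```cpp
// Verifies that f(n) = 3n + 2 satisfies f(n) <= 4n
// for every n from n0 = 2 up to limit.
bool holdsUpTo(long limit) {
    for (long n = 2; n <= limit; ++n)
        if (3 * n + 2 > 4 * n) return false;
    return true;
}
```

Note that at n = 1 the inequality fails (5 > 4), which is exactly why n0 = 2 is needed.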

Intuitive Interpretation of Growth Functions

A GRF of O(1) implies a problem whose time requirement is constant and thus independent of the problem's size n.

A GRF of O(log2 n) implies a problem:
For which the time requirement increases slowly as the problem size increases.
If you square the problem size, you only double its time requirement.
That often occurs in an algorithm that solves a problem by solving a smaller constant fraction of the problem (like binary search).
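Binary search is the standard example of this pattern; the sketch below halves the search range on every comparison, so at most about log2(n) + 1 probes are needed:

```cpp
#include <vector>

// Binary search on a sorted vector. Each comparison discards half of
// the remaining range. Returns the index of target, or -1 if absent.
int binarySearch(const std::vector<int>& a, int target) {
    int lo = 0, hi = static_cast<int>(a.size()) - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;        // written this way to avoid overflow
        if (a[mid] == target) return mid;
        if (a[mid] < target) lo = mid + 1;   // discard the left half
        else                 hi = mid - 1;   // discard the right half
    }
    return -1;
}
```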

Intuitive Interpretation Continued

A GRF of O(n) implies a problem:
For which the time requirement is directly proportional to the size of the problem.
In which if you square the problem size, you also square its time requirement.
That often has a single loop from 1 to n.

A GRF of O(n log2 n) implies a problem:
For which the time requirement increases more rapidly than for a linear algorithm.
That usually divides the problem into smaller problems that are each solved separately.
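Merge sort is the classic instance of this divide-and-solve-separately pattern (a sketch, not taken from the slides): the recursion has about log2(n) levels and each level does O(n) merging work, giving O(n log2 n) overall:

```cpp
#include <cstddef>
#include <vector>

// Merge sort: split the input in half, sort each half recursively,
// then merge the two sorted halves in one linear pass.
std::vector<int> mergeSort(const std::vector<int>& a) {
    if (a.size() <= 1) return a;
    std::size_t mid = a.size() / 2;
    std::vector<int> left  = mergeSort(std::vector<int>(a.begin(), a.begin() + mid));
    std::vector<int> right = mergeSort(std::vector<int>(a.begin() + mid, a.end()));

    // Merge step: O(n) work at each of the ~log2(n) recursion levels.
    std::vector<int> out;
    out.reserve(a.size());
    std::size_t i = 0, j = 0;
    while (i < left.size() && j < right.size())
        out.push_back(left[i] <= right[j] ? left[i++] : right[j++]);
    while (i < left.size())  out.push_back(left[i++]);
    while (j < right.size()) out.push_back(right[j++]);
    return out;
}
```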

Intuitive Interpretation Continued

A GRF of O(n^2) implies a problem:
For which the time requirement increases rapidly with the size of the problem.
That often has two nested loops.

A GRF of O(n^3) implies a problem:
That often has three nested loops.
That is only practical for small problems.

A GRF of O(2^n) implies a problem:
For which the time requirements are exponential.
For which the time requirements increase too rapidly to be practical.
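The two-nested-loop shape behind O(n^2) is easy to see in selection sort (a sketch for illustration): the loops below perform roughly n(n-1)/2 comparisons.

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Selection sort: for each position i, scan the rest of the vector
// for the smallest remaining element -- two nested loops, O(n^2).
void selectionSort(std::vector<int>& a) {
    for (std::size_t i = 0; i + 1 < a.size(); ++i) {
        std::size_t minIdx = i;
        for (std::size_t j = i + 1; j < a.size(); ++j)
            if (a[j] < a[minIdx]) minIdx = j;   // remember the smallest so far
        std::swap(a[i], a[minIdx]);             // place it at position i
    }
}
```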

Read "Keeping Your Perspective," pages 454-456, in the book.
