Fall 2005
Introduction to
Analysis of Algorithms
Introduction

What is an algorithm? It may be specified:
  in English
  as a computer program
  as pseudo-code
Data structures: methods of organizing data used by algorithms.
Introduction: Why analyze algorithms?

An example: a city has n stops. A bus driver wishes to follow the
shortest path from one stop to another. Between every two stops, if a
road exists, it may take a different time than other roads. Also, roads
are one-way: the road from stop 1 to stop 2 is different from the road
from stop 2 to stop 1.
The number of possible paths can be as large as about (n/e)^n, so
checking them all is infeasible.
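The (n/e)^n figure comes from Stirling's approximation to n!, the number of orderings of n stops a brute-force search might examine. A quick sketch:

```python
import math

# n! >= (n/e)^n for all n >= 1 (Stirling's approximation), so the number
# of stop orderings grows at least this fast.
for n in (5, 10, 20):
    exact = math.factorial(n)
    stirling_lower = (n / math.e) ** n
    assert stirling_lower < exact
    print(n, exact, round(stirling_lower))
```

Already at n = 20 there are over 2 × 10^18 orderings, far too many to enumerate.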
What is algorithm analysis?

We only analyze correct algorithms.
An algorithm is correct if, for every input instance, it halts with the correct output.
Incorrect algorithms:
  might not halt at all on some input instances
  might halt with other than the desired answer
Analyzing an algorithm means predicting the resources it requires:
  memory and communication bandwidth
  computational time (usually most important)
Algorithm Analysis

Factors affecting the running time:
  the computer
  the compiler
  the algorithm used
  the input to the algorithm
The input size is usually the main consideration.
E.g., for sorting, the input size is the number of items to be sorted.
Running time of algorithms

Bounds on the running time: an example. The loop below computes a·b by
repeated addition, so its running time grows linearly in b.
  V ← a; W ← b
  while W > 1:
      V ← V + a
      W ← W - 1
  output V
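The pseudocode above can be transcribed directly; a minimal sketch in Python:

```python
def multiply_by_addition(a, b):
    """Compute a * b using only addition, mirroring the pseudocode above."""
    v, w = a, b
    while w > 1:      # runs b - 1 times, so the cost is O(b)
        v = v + a
        w = w - 1
    return v

print(multiply_by_addition(7, 6))  # 42
```

The loop body executes b - 1 times, so the number of additions (the bound we care about) is linear in b.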
Growth Rate: Big-Oh

f(N) = O(g(N)) if there are positive constants c and n0 such that
  f(N) ≤ c g(N) when N ≥ n0
The growth rate of f(N) is less than or equal to the growth rate of g(N).
Big-Oh: examples

Let f(N) = 2N^2. Then:
  f(N) = O(N^4)
  f(N) = O(N^3)
  f(N) = O(N^2) (best answer, asymptotically tight)

More examples:
  N^2/2 - 3N = O(N^2)
  1 + 4N = O(N)
  7N^2 + 10N + 3 = O(N^2) = O(N^3)
  log10 N = log2 N / log2 10 = O(log2 N) = O(log N)
  sin N = O(1); 10 = O(1); 10^10 = O(1)
  sum_{i=1}^{N} i ≤ N · N = O(N^2)
  sum_{i=1}^{N} i^2 ≤ N · N^2 = O(N^3)
  log N + N = O(N)
  N = O(2^N), but 2^N is not O(N)
  2^(10N) is not O(2^N)
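A Big-Oh claim can be sanity-checked numerically by exhibiting witnesses c and n0; the helper below is my own illustration (a finite check, not a proof):

```python
def check_big_oh(f, g, c, n0, n_max=10_000):
    """Check f(N) <= c * g(N) for all n0 <= N < n_max.
    A finite sanity check of a Big-Oh claim, not a proof."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max))

# 7N^2 + 10N + 3 = O(N^2): the witnesses c = 20, n0 = 1 work.
assert check_big_oh(lambda n: 7*n*n + 10*n + 3, lambda n: n*n, c=20, n0=1)

# 2^(10N) is not O(2^N): even a huge c fails quickly, since the
# ratio 2^(9N) is unbounded.
assert not check_big_oh(lambda n: 2**(10*n), lambda n: 2**n,
                        c=10**6, n0=1, n_max=100)
```

For the first claim, 7N^2 + 10N + 3 ≤ 20N^2 for all N ≥ 1, which is exactly the definition with c = 20 and n0 = 1.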
Some rules

When considering the growth rate of a function using Big-Oh:
  ignore the lower-order terms and the coefficient of the highest-order term
  no need to specify the base of the logarithm
Big-Omega

f(N) = Ω(g(N)) if there are positive constants c and n0 such that
  f(N) ≥ c g(N) when N ≥ n0
The growth rate of f(N) is greater than or equal to the growth rate of g(N).
Big-Omega: examples

Let f(N) = 2N^2. Then:
  f(N) = Ω(N)
  f(N) = Ω(N^2) (best answer)
Big-Theta

f(N) = Θ(g(N)) iff f(N) = O(g(N)) and f(N) = Ω(g(N))
The growth rate of f(N) equals the growth rate of g(N).
Example: let f(N) = N^2 and g(N) = 2N^2. Then f(N) = O(g(N)) and
f(N) = Ω(g(N)), so f(N) = Θ(g(N)).
Some rules

If T(N) is a polynomial of degree k, then T(N) = Θ(N^k).
For logarithmic functions, T(log_m N) = Θ(log N).
Little-oh

f(N) = o(g(N)) iff f(N) = O(g(N)) and f(N) ≠ Θ(g(N))
The growth rate of f(N) is strictly less than the growth rate of g(N).
Using limits to compare growth rates: if lim_{N→∞} f(N)/g(N) exists, then
  if the limit is 0:              f(N) = o(g(N))
  if the limit is a constant ≠ 0: f(N) = Θ(g(N))
  if the limit is ∞:              g(N) = o(f(N))
  if the limit oscillates:        no relation
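The limit rule can be illustrated by sampling f(N)/g(N) at large N; this is a numeric sketch of each case, not a proof:

```python
import math

def ratio_at(f, g, n):
    """Sample f(N)/g(N) at a single large N."""
    return f(n) / g(n)

# sqrt(N)/N -> 0, so sqrt(N) = o(N):
assert ratio_at(math.sqrt, lambda n: n, 10**12) < 1e-5

# N^2 / (2N^2) -> 1/2, a nonzero constant, so N^2 = Theta(2N^2):
assert abs(ratio_at(lambda n: n * n, lambda n: 2 * n * n, 10**6) - 0.5) < 1e-9

# N / log N -> infinity, so log N = o(N):
assert ratio_at(lambda n: n, math.log, 10**12) > 10**10
```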
Example Functions

sqrt(n), n, 2n, ln n, exp(n), n + sqrt(n), n + n^2

lim_{n→∞} sqrt(n)/n = 0, so sqrt(n) is o(n) and n is ω(sqrt(n)).
n is O(2n) and Ω(2n), so n is Θ(2n); likewise 2n is Θ(n).
Growth rates: effect of doubling the input size

  f(N) = c       →  f(2N) = f(N) = c
  f(N) = log N   →  f(2N) = f(N) + log 2
  f(N) = N       →  f(2N) = 2 f(N)
  f(N) = N^2     →  f(2N) = 4 f(N)
  f(N) = N^3     →  f(2N) = 8 f(N)
  f(N) = 2^N     →  f(2N) = f(N)^2
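Each row of the doubling table can be confirmed directly; a small check:

```python
import math

# Confirm how f(2N) relates to f(N) for each growth rate in the table.
n = 1000
assert abs(math.log(2 * n) - (math.log(n) + math.log(2))) < 1e-12  # log N row
assert 2 * n == 2 * n                                              # N row
assert (2 * n) ** 2 == 4 * n ** 2                                  # N^2 row
assert (2 * n) ** 3 == 8 * n ** 3                                  # N^3 row
assert 2 ** (2 * n) == (2 ** n) ** 2                               # 2^N row
print("doubling table verified for N =", n)
```

Note the contrast: doubling the input adds a constant for log N, multiplies by a constant for polynomials, but squares the value for 2^N.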
Advantages of algorithm analysis
Example

Calculate sum_{i=1}^{N} i^3 with a simple loop. Counting operations line by
line: the loop body costs 4N (two multiplications, one addition, and one
assignment per iteration), the loop control costs 2N + 2 (one initialization,
N + 1 tests, and N increments), and the initialization and return each cost 1.
The total is 6N + 4 operations, i.e. O(N).
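A sketch of the routine being counted (the sum-of-cubes loop is an assumption based on the 4N and 2N + 2 figures above):

```python
def sum_cubes(n):
    partial_sum = 0                # 1 operation
    for i in range(1, n + 1):      # loop control: about 2N + 2 operations
        partial_sum += i * i * i   # 4 operations per iteration: 4N total
    return partial_sum             # 1 operation

# Total: 6N + 4 operations, i.e. O(N).
print(sum_cubes(4))  # 1 + 8 + 27 + 64 = 100
```

As a sanity check, sum_{i=1}^{N} i^3 = (N(N+1)/2)^2, so sum_cubes(10) should equal 55^2 = 3025.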
General Rules

For loops: the running time is at most the running time of the statements
inside the loop times the number of iterations.
Nested for loops: analyze from the inside out; the total running time of a
statement is its running time multiplied by the product of the sizes of all
the enclosing loops.
Consecutive statements: the running times add (so the maximum dominates).
If/Else: the running time is never more than the running time of the test
plus the larger of the running times of S1 and S2.
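The nested-loop rule in action: the body below executes N × N times, so the function is O(N^2).

```python
def count_pairs(n):
    """Count ordered pairs (i, j) with 0 <= i, j < n using nested loops."""
    count = 0
    for i in range(n):        # outer loop: N iterations
        for j in range(n):    # inner loop: N iterations each
            count += 1        # O(1) body, executed N * N times
    return count

print(count_pairs(10))  # 100
```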
Another Example

Maximum subsequence sum problem: given integers A1, A2, ..., AN (possibly
negative), find the maximum value of the sum of a contiguous subsequence.
Algorithm 1: simple enumeration of all subsequences, O(N^3)
Algorithm 2: divide-and-conquer
Algorithm 2 (contd)

The maximum subsequence lies entirely in the first half, entirely in the
second half, or spans the middle. The first two cases can be solved
recursively. For the last case:
  find the largest sum in the first half that includes the last element in the first half
  find the largest sum in the second half that includes the first element in the second half
  add these two sums together
E.g., for 5, 2, -1, 2, 6, -2: the best sum ending at the middle is
5 + 2 - 1 = 6, the best sum starting after the middle is 2 + 6 = 8,
giving 6 + 8 = 14.
Algorithm 2: cost of each step

  base case (one element):        O(1)
  solve first half recursively:   T(m/2)
  solve second half recursively:  T(m/2)
  find the best spanning sum:     O(m)
  combine the three candidates:   O(1)
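A sketch of the divide-and-conquer algorithm described above (function and variable names are my own; the empty subsequence counts as sum 0, a common convention):

```python
def max_subsequence_sum(a, lo=0, hi=None):
    """Maximum contiguous subsequence sum by divide and conquer, O(N log N)."""
    if hi is None:
        hi = len(a) - 1
    if lo == hi:                                       # base case: O(1)
        return max(a[lo], 0)
    mid = (lo + hi) // 2
    left_best = max_subsequence_sum(a, lo, mid)        # T(m/2)
    right_best = max_subsequence_sum(a, mid + 1, hi)   # T(m/2)

    # Spanning case, O(m): best suffix of the first half (must include
    # a[mid]) plus best prefix of the second half (must include a[mid+1]).
    suffix_sum, suffix_best = 0, float("-inf")
    for i in range(mid, lo - 1, -1):
        suffix_sum += a[i]
        suffix_best = max(suffix_best, suffix_sum)
    prefix_sum, prefix_best = 0, float("-inf")
    for i in range(mid + 1, hi + 1):
        prefix_sum += a[i]
        prefix_best = max(prefix_best, prefix_sum)

    return max(left_best, right_best, suffix_best + prefix_best)

print(max_subsequence_sum([5, 2, -1, 2, 6, -2]))  # 14
```

On 5, 2, -1, 2, 6, -2 the spanning case wins: suffix best 6 plus prefix best 8 gives 14, matching the hand computation above.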
Algorithm 2 (contd)

Recurrence equation:
  T(1) = 1
  T(N) = 2 T(N/2) + N
Algorithm 2 (contd)

  T(N) = 2 T(N/2) + N
       = 4 T(N/4) + 2N
       = 8 T(N/8) + 3N
       = ...
       = 2^k T(N/2^k) + kN
Setting 2^k = N (so k = log N) gives
  T(N) = N T(1) + N log N = N log N + N
Thus, the running time is O(N log N).
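The closed form can be checked against the recurrence directly, at least for powers of two:

```python
def t(n):
    """Evaluate the recurrence T(1) = 1, T(N) = 2 T(N/2) + N (N a power of 2)."""
    if n == 1:
        return 1
    return 2 * t(n // 2) + n

# Closed form: T(N) = N log2(N) + N. Check it for N = 1, 2, 4, ..., 1024.
for k in range(11):
    n = 2 ** k
    assert t(n) == n * k + n
print("closed form verified up to N = 1024")
```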