
CS109A Notes for Lecture 1/26/96

Running Time

A program or algorithm has a running time T(n), where n is the measure of the size of the input.

T(n) is the largest amount of time the program takes on any input of size n.

Example: For a sorting algorithm, we normally choose n to be the number of elements to be sorted. For Mergesort, T(n) = n log n; for Selection-sort or Quicksort, T(n) = n^2.

But there is an unknowable constant factor that depends on such things as the machine speed, the quality of the compiler, and the load on the machine.
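
To make these growth rates concrete, here is a minimal sketch (not part of the original notes) that counts comparisons rather than measuring time, which sidesteps the machine-dependent constant factor:

    import random

    def selection_sort(a):
        """Sort a in place; return the comparison count, which is always
        (n-1) + (n-2) + ... + 1 = n(n-1)/2, proportional to n^2."""
        comparisons = 0
        for i in range(len(a)):
            smallest = i
            for j in range(i + 1, len(a)):
                comparisons += 1
                if a[j] < a[smallest]:
                    smallest = j
            a[i], a[smallest] = a[smallest], a[i]
        return comparisons

    def merge_sort(a):
        """Return (sorted copy of a, comparison count of roughly n log2 n)."""
        if len(a) <= 1:
            return list(a), 0
        mid = len(a) // 2
        left, cl = merge_sort(a[:mid])
        right, cr = merge_sort(a[mid:])
        merged, comparisons, i, j = [], cl + cr, 0, 0
        while i < len(left) and j < len(right):
            comparisons += 1
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        merged += left[i:] + right[j:]
        return merged, comparisons

    for n in (1000, 2000):
        data = [random.random() for _ in range(n)]
        print(n, selection_sort(data[:]), merge_sort(data)[1])
        # Doubling n quadruples the selection-sort count but only a
        # little more than doubles the mergesort count.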

Why Measure Running Time?


- Guides our selection of an algorithm to implement.
- Helps us explore for better solutions without expensive implementation, testing, and measurement.

Arguments Against Running-Time Measurement


- Algorithms often perform much better on average than the worst-case running time implies; e.g., quicksort is n log n on a "random" list but n^2 in the worst case, where each division separates out only 1 element (see the sketch after this list). But for most algorithms, the worst case is a good predictor of typical behavior, and when average and worst cases are radically different, we can do an average-case analysis.

- Who cares? In a few years machines will be so fast that even bad algorithms will be fast. But the faster computers get, the more we find to do with them and the larger the size of the problems we try to solve. Asymptotic behavior (growth rate) of the running time becomes more important, not less, because we are getting closer to the asymptote.

- Constant factors hidden by "big-oh" are more important than the growth rate of the running time. True only for small instances, and anything is OK when your input is small.

- Benchmarking (running the program on a popular set of test cases) is easier. Sometimes true, but then you've committed yourself to an implementation already.
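
A small sketch of the quicksort point above (not from the notes; the first-element pivot rule is our assumption): on an already-sorted list, each partition peels off only one element, giving about n^2/2 comparisons, while a shuffled list averages about 1.4 n log2 n.

    import random

    def quicksort_comparisons(a):
        """Count the comparisons quicksort makes when the pivot is always
        the first element of the current sublist (iterative, so the deep
        worst-case recursion does not overflow the stack)."""
        comparisons, stack = 0, [list(a)]
        while stack:
            sub = stack.pop()
            if len(sub) <= 1:
                continue
            pivot, rest = sub[0], sub[1:]
            comparisons += len(rest)  # compare every element to the pivot
            stack.append([x for x in rest if x < pivot])
            stack.append([x for x in rest if x >= pivot])
        return comparisons

    n = 1000
    print(quicksort_comparisons(range(n)))  # sorted input: n(n-1)/2 = 499500
    shuffled = list(range(n)); random.shuffle(shuffled)
    print(quicksort_comparisons(shuffled))  # typically around 14,000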

Big-Oh

A notation to let us ignore the unknowable constant factors and focus on the growth rate of the running time.

Say T(n) is O(f(n)) if, for "large" n, T(n) is no more than proportional to f(n).

More formally: there exist constants c > 0 and n0 such that for all n ≥ n0 we have T(n) ≤ c·f(n).

n0 and c are called witnesses to the fact that T(n) is O(f(n)).

Example: 10n^2 + 50n + 100 is O(n^2). Pick witnesses n0 = 1 and c = 160. Then for any n ≥ 1,

    10n^2 + 50n + 100 ≤ 160n^2.

Other choices of witness are possible, e.g., (n0 = 10, c = 16).

General rule: any polynomial is big-oh of its leading term with a coefficient of 1.
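
These witnesses can be checked numerically over a finite range (a sketch, not part of the notes; the upper bound 10000 is arbitrary). A finite check is only evidence, of course; the algebra (50n ≤ 50n^2 and 100 ≤ 100n^2 for n ≥ 1) is what proves the bound for all n.

    def T(n):
        return 10 * n**2 + 50 * n + 100

    # Witnesses n0 = 1, c = 160 from the example above:
    assert all(T(n) <= 160 * n**2 for n in range(1, 10000))
    # The alternative witnesses (n0 = 10, c = 16):
    assert all(T(n) <= 16 * n**2 for n in range(10, 10000))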

Example: n^10 is O(2^n).

Note that n^10 can be very large compared to 2^n for "small" n.

n^10 ≤ 2^n is the same as saying 10·log2(n) ≤ n. (False for n = 32; true for n = 64.)

Pick witnesses n0 = 64 and c = 1. For n ≥ 64 we have n^10 ≤ 1·2^n.
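
A quick check of the crossover claim (a sketch; Python integers are exact, so the large powers are computed without overflow):

    import math

    for n in (32, 64):
        print(n, n**10 <= 2**n, 10 * math.log2(n) <= n)
    # 32 False False   (32^10 = 2^50 > 2^32)
    # 64 True True     (64^10 = 2^60 <= 2^64)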

Growth Rates of Common Functions

The base of a logarithm doesn't matter.

- log_a(n) is O(log_b(n)) for any bases a and b, because log_a(n) = (log_a b)·(log_b n) (i.e., the witnesses are n0 = 1, c = log_a b); the sketch below checks this numerically.
- Thus, we omit the base when talking about big-oh.

Logarithms grow slower than any power of n, e.g., log n is O(n^(1/10)).

An exponential is c^n for some constant c > 1. Polynomials grow slower than any exponential, e.g., n^10 is O(1.001^n).

Generally, exponential running times are impossibly slow; polynomial running times are tolerable.
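
A sketch (not in the notes) confirming that changing the base of a logarithm changes only a constant factor, which is exactly what big-oh hides:

    import math

    # log2(n) / log10(n) is the same constant for every n, namely
    # log2(10), because log_a(n) = (log_a b) * (log_b n).
    for n in (10, 1000, 10**6):
        print(n, math.log2(n) / math.log10(n))  # always about 3.3219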

Proofs That a Big-oh Relationship is False


Example: n^3 is not O(n^2). In proof: suppose it were. Then there would be witnesses n0 and c such that for all n ≥ n0 we have n^3 ≤ c·n^2.

Choose n1 to be:

1. At least as large as n0.
2. At least as large as 2c.

Then n^3 ≤ c·n^2 holds for n = n1, because n1 ≥ n0 by (1).

If n1^3 ≤ c·n1^2, then n1 ≤ c (divide both sides by n1^2).

But by (2), n1 ≥ 2c. Since c > 0 (which holds for any witness c), it is not possible that 2c ≤ n1 ≤ c; that would require 2c ≤ c.

Thus, our assumption that we could find witnesses n0 and c was wrong, and we conclude that n^3 is not O(n^2).
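
The proof's choice of n1 can be played out concretely (a sketch; refute is a hypothetical helper, not from the notes): whatever witnesses are proposed, n1 = max(n0, 2c) falsifies the inequality.

    def refute(n0, c):
        """Test n^3 <= c*n^2 at n1 = max(n0, 2c), the point chosen in
        the proof. Always False when c > 0, since n1 >= 2c > c
        implies n1^3 > c*n1^2."""
        n1 = max(n0, 2 * c)
        return n1**3 <= c * n1**2

    print(refute(5, 100), refute(1, 10**6))  # False False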

General Idea of Non-Big-Oh Proofs


(Template: p. 101 of FCS.)

1. Assume witnesses n0 and c exist.
2. Select n1 in terms of n0 and c.
3. Show that n1 ≥ n0, so the inequality T(n) ≤ c·f(n) must hold for n = n1.
4. Show that for the particular n1 chosen, T(n1) > c·f(n1).
5. Conclude from (3) and (4) that n0 and c are not really witnesses. Since we assumed nothing special about witnesses n0 and c, we conclude that no witnesses exist, and therefore the big-oh relationship does not hold.
