
Analysis of Algorithms

Asymptotic Analysis

What is the order of growth?


In the running-time expression, when n becomes large one term becomes significantly larger than the others: this is the so-called dominant term.

T1(n) = a*n + b                  Dominant term: a*n
T2(n) = a*log n + b              Dominant term: a*log n
T3(n) = a*n^2 + b*n + c          Dominant term: a*n^2
T4(n) = a^n + b*n + c  (a > 1)   Dominant term: a^n

What is the order of growth?


T1(kn) = a*(kn) = k*T1(n)                Order of growth: linear
T2(kn) = a*log(kn) = T2(n) + a*log k     Order of growth: logarithmic
T3(kn) = a*(kn)^2 = k^2*T3(n)            Order of growth: quadratic
T4(kn) = a^(kn) = (a^n)^k                Order of growth: exponential

How should the order of growth be interpreted?


Of two algorithms, the one with the smaller order of growth is considered more efficient. However, this is true only for large enough input sizes.

Example. Let us consider
T1(n) = 10n + 10 (linear order of growth)
T2(n) = n^2 (quadratic order of growth)
If n <= 10 then T1(n) > T2(n); thus the order of growth is relevant only for n > 10.
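A minimal Python check of this crossover, using the two running-time functions from the example above:

    def t1(n):  # linear: 10n + 10
        return 10 * n + 10

    def t2(n):  # quadratic: n^2
        return n * n

    # First n at which the quadratic function overtakes the linear one.
    crossover = next(n for n in range(1, 100) if t2(n) > t1(n))
    print(crossover)  # 11 -> for n <= 10, T1(n) > T2(n); beyond that T2 dominates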

Growth Rate

Growth Rate of Different Functions

[Figure: function value (0 to 300) plotted against data size for lg n, n lg n, n, n^2, n^3 and 2^n]

Growth Rates
n       lg n    n lg n    n^2        n^3         2^n
0       -       -         0          0           1
1       0       0         1          1           2
2       1       2         4          8           4
4       2       8         16         64          16
8       3       24        64         512         256
16      4       64        256        4096        65536
32      5       160       1024       32768       4.29E+09
64      6       384       4096       262144      1.84E+19
128     7       896       16384      2097152     3.4E+38
256     8       2048      65536      16777216    1.16E+77
512     9       4608      262144     1.34E+08    1.3E+154
1024    10      10240     1048576    1.07E+09    -
2048    11      22528     4194304    8.59E+09    -

(lg 0 is undefined; the largest 2^n values exceed the original spreadsheet's range and were not recorded.)
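A short Python sketch (not from the original slides) that regenerates rows of this table:

    import math

    def sci(x):
        # Scientific notation for arbitrarily large integers (avoids float overflow).
        s = str(x)
        return s if len(s) <= 7 else f"{s[0]}.{s[1:3]}e+{len(s) - 1}"

    print(f"{'n':>5} {'lg n':>5} {'n lg n':>7} {'n^2':>8} {'n^3':>10} {'2^n':>10}")
    for n in [1, 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024, 2048]:
        lg = int(math.log2(n))
        print(f"{n:>5} {lg:>5} {n * lg:>7} {n**2:>8} {sci(n**3):>10} {sci(2**n):>10}")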

Constant Factors
The growth rate is not affected by
constant factors or lower-order terms

Examples
10^2*n + 10^5 is a linear function; 10^5*n^2 + 10^8*n is a quadratic function

Time Complexities
Time complexity depends on:
- Input size
- Basic operation

Concept of Basic Operation


Time efficiency is analyzed by determining the number of repetitions of the basic operation as a function of input size.
Basic operation: the operation that contributes most to the running time of the algorithm.
As a rule, the basic operation is located in the algorithm's innermost loop.

Basic operation in sorting? Typically the comparison of two array elements.

Every Case Time Complexity


For a given algorithm, T(n) is the every-case time complexity if the algorithm repeats its basic operation the same number of times for every input of size n. Determining T(n) is called every-case time complexity analysis.

Every Case Time Complexity (Examples)


Sum of the elements of an array:

Algorithm sum_array(A, n)
    sum = 0
    for i = 1 to n
        sum += A[i]
    return sum

Every Case Time Complexity (Examples)


Basic operation: addition (adding up the elements of the array).
How many times is it repeated? Once for every element of the array.

Complexity: T(n) = n = O(n)
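A runnable Python version of the sum_array pseudocode above; the operation counter is added here purely for illustration:

    def sum_array(A):
        # Sum the elements of A, counting executions of the basic operation (addition).
        total, ops = 0, 0
        for x in A:
            total += x
            ops += 1
        return total, ops

    total, ops = sum_array([3, 1, 4, 1, 5, 9, 2, 6])
    print(total, ops)  # 31 8 -> the addition runs n times, so T(n) = n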

Every Case Time Complexity (Examples)


Exchange Sort:

Algorithm exchange_sort(A, n)
    for i = 1 to n-1
        for j = i+1 to n
            if A[i] > A[j]
                exchange A[i] and A[j]

Basic operation: comparison of array elements.
How many times is it repeated? Once for every pair (i, j) with i < j, i.e. (n-1) + (n-2) + ... + 1 times.
Complexity: T(n) = n(n-1)/2 = O(n^2)
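A Python sketch of exchange sort with a comparison counter (the counter is illustrative, not part of the original pseudocode):

    def exchange_sort(A):
        # Sort A in place, counting the basic operation (element comparison).
        n = len(A)
        comparisons = 0
        for i in range(n - 1):
            for j in range(i + 1, n):
                comparisons += 1
                if A[i] > A[j]:
                    A[i], A[j] = A[j], A[i]
        return comparisons

    data = [5, 2, 4, 1, 3]
    ops = exchange_sort(data)
    print(data, ops)  # [1, 2, 3, 4, 5] 10 -> n(n-1)/2 = 5*4/2 = 10 comparisons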

Best Case Time Complexity


For a given algorithm, B(n) is the best-case time complexity if it gives the minimum number of times the algorithm repeats its basic operation over all inputs of size n. Determining B(n) is called best-case time complexity analysis.

Best Case Time Complexity (Example)


Algorithm sequential_search(A, n, key)
    i = 0
    while i < n and A[i] != key
        i = i + 1
    if i < n
        return i
    else
        return -1

Input size: the number of elements in the array, i.e. n
Basic operation: comparison of the key with array elements
Best case: the first element is the required key, so B(n) = 1 = O(1)

Worst Case Time Complexity


For a given algorithm, W(n) is the worst-case time complexity if it gives the maximum number of times the algorithm repeats its basic operation over all inputs of size n. Determining W(n) is called worst-case time complexity analysis.

Sequential Search
Input size: the number of elements in the array, i.e. n
Basic operation: comparison of the key with array elements
Worst case: the last element is the required key, or the key is not present in the array at all
Complexity: W(n) = n = O(n)
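A runnable Python version of sequential_search, with a comparison counter (added for illustration) showing the best- and worst-case counts:

    def sequential_search(A, key):
        # Return (index, comparisons); index is -1 if key is absent.
        comparisons = 0
        for i, x in enumerate(A):
            comparisons += 1
            if x == key:
                return i, comparisons
        return -1, comparisons

    data = [7, 3, 9, 4, 8]
    print(sequential_search(data, 7))    # (0, 1)  -> best case: B(n) = 1
    print(sequential_search(data, 8))    # (4, 5)  -> key in last position
    print(sequential_search(data, 100))  # (-1, 5) -> key absent: W(n) = n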

Average Case Time Complexity


For a given algorithm, A(n) is the average-case time complexity if it gives the number of times the algorithm repeats its basic operation, averaged over all inputs of size n. Determining A(n) is called average-case time complexity analysis.

Input size: the number of elements in the array, i.e. n
Basic operation: comparison of the key with array elements

Average case: the probability of a successful search is p (0 <= p <= 1). Each position is equally likely, so the key is at position i with probability p/n, costing i comparisons; an unsuccessful search (probability 1-p) costs n comparisons.

A(n) = [1*(p/n) + 2*(p/n) + ... + n*(p/n)] + n*(1-p)
     = (p/n)*(1 + 2 + 3 + ... + n) + n*(1-p)
     = p*(n+1)/2 + n*(1-p)
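A quick Monte Carlo check of this formula in Python (n, p and the trial count are chosen arbitrarily for illustration):

    import random

    def trial(n, p):
        # One search: with probability p the key sits at a uniform random position
        # (position i costs i comparisons); otherwise it is absent (n comparisons).
        if random.random() < p:
            return random.randint(1, n)
        return n

    n, p, trials = 100, 0.75, 200_000
    avg = sum(trial(n, p) for _ in range(trials)) / trials
    print(avg, p * (n + 1) / 2 + n * (1 - p))  # simulated vs. A(n); both near 62.875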

Asymptotic Notations

Order Notation
There may be a situation where, beyond some point n0, cg(n) stays above f(n):

[Figure: f(n) bounded above by cg(n) for all n >= n0]

f(n) <= g(n) for all n >= n0 and c = 1, or more generally f(n) <= cg(n) for all n >= n0.

g(n) is an asymptotic upper bound on f(n).
f(n) = O(g(n)) iff there exist two positive constants c and n0 such that f(n) <= cg(n) for all n >= n0.

Big-Oh Notation
Given functions f(n) and g(n), we say that f(n) is O(g(n)) if there are positive constants c and n0 such that
f(n) <= c*g(n) for n >= n0

Example: 2n + 10 is O(n)
2n + 10 <= c*n  =>  (c - 2)*n >= 10  =>  n >= 10/(c - 2)
Pick c = 3 and n0 = 10
[Figure: log-log plot of 2n + 10 against 3n for n from 1 to 1,000; 3n stays above 2n + 10 from n0 = 10 onward]

Big-Oh and Growth Rate


The big-Oh notation gives an upper bound on the growth rate of a function. The statement "f(n) is O(g(n))" means that the growth rate of f(n) is no more than the growth rate of g(n). We can use the big-Oh notation to rank functions according to their growth rate.
                     f(n) is O(g(n))    g(n) is O(f(n))
g(n) grows more      Yes                No
f(n) grows more      No                 Yes
Same growth          Yes                Yes

Big-Oh Example
Example: the function n^2 is not O(n)
n^2 <= c*n would require n <= c for all n >= n0. The inequality cannot be satisfied, since c must be a constant while n grows without bound.
[Figure: log-log plot of n^2, 100n, 10n and n for n from 1 to 1,000; n^2 eventually exceeds every constant multiple of n]

More Big-Oh Examples


7n - 2
7n - 2 is O(n): need c > 0 and n0 >= 1 such that 7n - 2 <= c*n for n >= n0; this is true for c = 7 and n0 = 1

3n^3 + 20n^2 + 5
3n^3 + 20n^2 + 5 is O(n^3): need c > 0 and n0 >= 1 such that 3n^3 + 20n^2 + 5 <= c*n^3 for n >= n0; this is true for c = 28 and n0 = 1

3 log n + log log n
3 log n + log log n is O(log n): need c > 0 and n0 >= 1 such that 3 log n + log log n <= c*log n for n >= n0; this is true for c = 4 and n0 = 2
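A small Python spot check of these (c, n0) witnesses over a finite range (a sanity check only; a finite scan cannot prove the bound for all n):

    import math

    def holds(f, g, c, n0, n_max=10_000):
        # Check f(n) <= c*g(n) for n0 <= n < n_max.
        return all(f(n) <= c * g(n) for n in range(n0, n_max))

    print(holds(lambda n: 7*n - 2, lambda n: n, c=7, n0=1))                   # True
    print(holds(lambda n: 3*n**3 + 20*n**2 + 5, lambda n: n**3, c=28, n0=1))  # True
    print(holds(lambda n: 3*math.log2(n) + math.log2(math.log2(n)),
                lambda n: math.log2(n), c=4, n0=2))                           # True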

Big-Oh Rules
If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e.:
1. Drop lower-order terms
2. Drop constant factors

Use the smallest possible class of functions


Say "2n is O(n)" instead of "2n is O(n^2)"

Use the simplest expression of the class


Say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)"

Order Notation
Asymptotic Lower Bound: f(n) = Ω(g(n))
iff there exist positive constants c and n0 such that f(n) >= cg(n) for all n >= n0

[Figure: f(n) staying above cg(n) for all n >= n0]

Big-Omega Notation
Given functions f(n) and g(n), we say that f(n) is Ω(g(n)) if there are positive constants c and n0 such that
f(n) >= c*g(n) for n >= n0

Example: 2n^2 + 10 is Ω(n)
2n^2 + 10 >= c*n  <=>  (2n^2 + 10)/n >= c  <=>  2n + 10/n >= c
Pick c = 1 and n0 = 1
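The same spot-check idea as for big-Oh works here, with the inequality reversed (again only a finite-range sanity check):

    # Check 2n^2 + 10 >= c*n for the witnesses c = 1, n0 = 1.
    print(all(2*n**2 + 10 >= 1*n for n in range(1, 10_000)))  # True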

Order Notation
Asymptotically Tight Bound: f(n) = Θ(g(n))
iff there exist positive constants c1, c2 and n0 such that c1*g(n) <= f(n) <= c2*g(n) for all n >= n0

[Figure: f(n) sandwiched between c1*g(n) and c2*g(n) for all n >= n0]

This means that the best and worst cases require the same amount of time, to within a constant factor.

Intuition for Asymptotic Notation


Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n)
Big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n)
Big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n)

Relatives of Big-Oh
little-oh (o) and little-omega (ω)

Small o Notation
f(n) belongs to o(g(n)) iff for every positive constant c there exists an n0 such that f(n) < c*g(n) for all n >= n0; that is, f(n) eventually falls below every constant multiple of g(n).

Small Omega Notation


f(n) belongs to ω(g(n)) iff for every positive constant c there exists an n0 such that f(n) > c*g(n) for all n >= n0; that is, f(n) eventually exceeds every constant multiple of g(n).

Using Limits for Comparing Order of Growth
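The standard limit test compares lim (n -> infinity) f(n)/g(n): a limit of 0 means f(n) is o(g(n)), a finite positive limit means f(n) is Θ(g(n)), and an infinite limit means f(n) is ω(g(n)). A minimal sketch using sympy's limit (assuming sympy is installed):

    from sympy import symbols, limit, log, oo

    n = symbols('n', positive=True)

    def compare(f, g):
        # Classify f against g by the limit of f/g as n -> infinity.
        L = limit(f / g, n, oo)
        if L == 0:
            return "f is o(g)"      # f grows strictly slower than g
        if L == oo:
            return "f is omega(g)"  # f grows strictly faster than g
        return "f is Theta(g)"      # same order of growth

    print(compare(10*n + 10, n**2))         # f is o(g)
    print(compare(3*n**3 + 20*n**2, n**3))  # f is Theta(g)
    print(compare(n*log(n), n))             # f is omega(g)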
