In asymptotic analysis, we evaluate the performance of an algorithm in terms of input size (we don't measure the actual running time). We calculate how the time (or space) taken by an algorithm increases with the input size. For example, one way to search is Linear Search (order of growth is linear) and another is Binary Search (order of growth is logarithmic). Beyond a certain input array size, Binary Search will definitely start taking less time than Linear Search, even if the Binary Search is run on a slower machine.
In this post, we will take an example of Linear Search and analyze it using asymptotic analysis.
For some algorithms, all the cases are asymptotically the same, i.e., there are no separate worst and best cases. For example, Merge Sort does Θ(nLogn) operations in all cases.
Big O Notation
The Big O notation defines an upper bound of an algorithm; it bounds a function only from above. For example, consider the case of Insertion Sort. It takes linear time in the best case and quadratic time in the worst case. We can safely say that the time complexity of Insertion Sort is O(n^2). Note that O(n^2) also covers linear time.
1) O(1): The time complexity of a loop is considered O(1) if it runs a constant number of times, i.e., the bound does not depend on n.
// Here c is a constant
for (int i = 1; i <= c; i++) {
// some O(1) expressions
}
2) O(n): The time complexity of a loop is considered O(n) if the loop variable is incremented / decremented by a constant amount. For example, the following loop has O(n) time complexity.
3) O(n^c): The time complexity of nested loops is equal to the number of times the innermost statement is executed. For example, the following sample loops have O(n^2) time complexity.
for (int i = 1; i <=n; i += c) {
for (int j = 1; j <=n; j += c) {
// some O(1) expressions
}
}
4) O(Logn): The time complexity of a loop is considered O(Logn) if the loop variable is divided / multiplied by a constant amount.
// Here c is a constant greater than 1
for (int i = 1; i <= n; i *= c) {
// some O(1) expressions
}
How do we calculate time complexity when there are many if-else statements inside loops? For worst-case analysis, we consider the situation in which the values in the if-else conditions cause the maximum number of statements to be executed (and the minimum for best-case analysis).
Space complexity is a measure of the amount of working storage an algorithm needs. That
means how much memory, in the worst case, is needed at any point in the algorithm. As with
time complexity, we're mostly concerned with how the space needs grow, in big-Oh terms, as
the size N of the input problem grows.
A function with three parameters and one local variable requires 3 units of space for the parameters and 1 for the local variable, and this never changes, so its space complexity is O(1).
If a function A uses M units of its own space (local variables and parameters), and it calls a function B that needs N units of local space, then A overall needs M + N units of temporary workspace. What if A calls B 3 times? When a function finishes, its space can be reused, so if A calls B 3 times, it still only needs M + N units of workspace. What if A calls itself recursively N times? Then its space can't be reused, because every call is still in progress, so it needs N · M units of workspace, which is O(N) when each call's space M is constant.
Pseudo-polynomial Algorithms
An algorithm whose worst-case time complexity depends on the numeric value of the input (not on the number of inputs) is called a pseudo-polynomial algorithm. For example, consider the problem of counting the frequencies of all elements in an array of positive numbers. A pseudo-polynomial time solution is to first find the maximum value, then iterate from 1 to the maximum value and, for each value, count its occurrences in the array. This solution takes time proportional to the maximum value in the input array, and is therefore pseudo-polynomial.