Algorithms Analysis
Important Topics
We hope this analysis will be useful in preparing Algorithms. You can use it to identify the topics most frequently asked in GATE and crack the GATE 2017 exam.
https://courses.gradeup.co/batch/76afc3d0-a53d-11e8-a208-0a99e0cf8f3a/play/9a736e9c-a9f3-11e8-a208-0a99e0cf8f3a 1/1
12/15/2018 Gradeup Play
Important Topics to prepare for Algorithms
Day 1 | Algorithms Orientation | Introductions
The Algorithms subject is very important, as it covers basic concepts that are used throughout the Computer Science branch. It also carries a good amount of weightage in the GATE exam and PSU exams, and many interview questions in both the public and private sector are asked from it. So we are providing a list of important topics in this subject. While preparing, we advise you not to miss any of these topics, because they are the ones asked most often in GATE and other exams.
Time Complexity
Properties of Asymptotic Notations
Comparing functions in terms of Big-Oh
Finding time complexity of given code snippet
Divide and Conquer
Finding minimum/maximum (Recurrence Relation and Time Complexity)
Power of an element (Recurrence Relation and Time Complexity)
Binary Search (Recurrence Relation and Time Complexity)
Merge Sort (Recurrence Relation and Time Complexity)
Quick Sort (Recurrence Relation and Time Complexity of Best case and Worst Case)
Heap Sort (Heap Tree, Heapify function, Recurrence Relation and Time Complexity)
Greedy Algorithms
Huffman coding (Finding the average length of the given characters)
Fractional Knapsack (Finding maximum profit)
Job sequencing with a deadline (Finding maximum profit to complete given jobs within the given deadline)
Minimum Cost Spanning Tree (Prim's and Kruskal's Algorithms)
Single Source shortest paths (Dijkstra's Algorithm)
Bellman Ford Algorithm
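The greedy idea behind several of the topics above can be illustrated with fractional knapsack; a minimal sketch (the function name and the (profit, weight) item format are illustrative, not from the original notes):

```python
def fractional_knapsack(items, capacity):
    """Greedy fractional knapsack: pick items by profit/weight ratio.
    items: list of (profit, weight) pairs. Returns the maximum profit."""
    # Sort by profit-to-weight ratio, highest ratio first.
    items = sorted(items, key=lambda pw: pw[0] / pw[1], reverse=True)
    total = 0.0
    for profit, weight in items:
        if capacity <= 0:
            break
        take = min(weight, capacity)        # whole item, or a fraction of it
        total += profit * (take / weight)
        capacity -= take
    return total
```

With items (60, 10), (100, 20), (120, 30) and capacity 50, the greedy choice takes the first two items whole and two-thirds of the third, for a profit of 240.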
Dynamic Programming
Longest Common Sub-sequence (Concept with example)
0/1 knapsack Algorithms (Concept with example)
Sum of subset problems (Concept with example)
Matrix chain Multiplications (Concept with example)
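The dynamic-programming topics above share one pattern: fill a table of subproblem answers bottom-up. A minimal sketch for the length of the Longest Common Sub-sequence (names are illustrative):

```python
def lcs_length(x, y):
    """Length of the Longest Common Subsequence of x and y, via bottom-up DP."""
    m, n = len(x), len(y)
    # dp[i][j] = LCS length of the prefixes x[:i] and y[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1       # characters match
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]
```

Both the table size and the running time are O(m·n).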
Hashing
Collision Resolving Techniques (Open Addressing and Chaining)
Probing (Linear and Quadratic)
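Collision resolution by chaining can be sketched in a few lines (an illustrative table using the division-method hash on integer keys, not a production implementation):

```python
class ChainedHashTable:
    """Hash table that resolves collisions by chaining (one list per slot)."""
    def __init__(self, size=7):
        self.size = size
        self.slots = [[] for _ in range(size)]

    def insert(self, key):
        idx = key % self.size            # simple division-method hash
        if key not in self.slots[idx]:
            self.slots[idx].append(key)  # colliding keys share a chain

    def search(self, key):
        return key in self.slots[key % self.size]
```

For example, keys 10 and 17 both hash to slot 3 (mod 7) and end up in the same chain.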
So these are the important topics in the Algorithms subject. Preparing these topics will cover almost the entire subject and help you score maximum marks in it.
Algorithm (Short Notes)
Selection Sort:
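A minimal Python sketch of selection sort (an illustrative implementation, not the code from the original notes):

```python
def selection_sort(a):
    """Repeatedly select the minimum of the unsorted suffix; O(n^2) comparisons in every case."""
    a = list(a)
    for i in range(len(a) - 1):
        m = min(range(i, len(a)), key=a.__getitem__)  # index of smallest remaining element
        a[i], a[m] = a[m], a[i]
    return a
```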
Merge Sort:
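Merge sort can be sketched as follows (illustrative names); its recurrence T(n) = 2T(n/2) + O(n) solves to O(n log n):

```python
def merge_sort(a):
    """Divide-and-conquer sort: split in half, sort each half, merge."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # merge the two sorted halves
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]
```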
Quick Sort:
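An illustrative sketch of quick sort using list partitioning (an in-place Lomuto or Hoare partition is more common in exams, but longer); average case O(n log n), worst case O(n^2):

```python
def quick_sort(a):
    """Partition around a pivot, then recursively sort the two sides."""
    if len(a) <= 1:
        return list(a)
    pivot = a[len(a) // 2]
    less    = [x for x in a if x < pivot]
    equal   = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quick_sort(less) + equal + quick_sort(greater)
```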
Insertion Sort:
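Insertion sort, sketched for reference (illustrative implementation); it runs in O(n) on already-sorted input and O(n^2) in the worst case:

```python
def insertion_sort(a):
    """Insert each element into the sorted prefix to its left."""
    a = list(a)
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:   # shift larger elements one slot right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a
```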
Bubble Sort:
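Bubble sort, sketched with the early-exit optimization that gives it an O(n) best case (illustrative implementation):

```python
def bubble_sort(a):
    """Repeatedly swap adjacent out-of-order pairs; stop early when a pass makes no swaps."""
    a = list(a)
    for n in range(len(a) - 1, 0, -1):
        swapped = False
        for i in range(n):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
        if not swapped:   # list is already sorted
            break
    return a
```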
Heap Sort:
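Heap sort, sketched with an explicit heapify function (illustrative implementation): building the max-heap takes O(n), and the n-1 extract-max steps take O(n log n) overall:

```python
def heap_sort(a):
    """Build a max-heap, then repeatedly move the root to the sorted suffix."""
    a = list(a)

    def heapify(n, i):
        # Sift a[i] down within the first n elements so the subtree at i is a max-heap.
        largest, l, r = i, 2 * i + 1, 2 * i + 2
        if l < n and a[l] > a[largest]:
            largest = l
        if r < n and a[r] > a[largest]:
            largest = r
        if largest != i:
            a[i], a[largest] = a[largest], a[i]
            heapify(n, largest)

    for i in range(len(a) // 2 - 1, -1, -1):   # build the heap: O(n)
        heapify(len(a), i)
    for end in range(len(a) - 1, 0, -1):       # extract max, shrink the heap
        a[0], a[end] = a[end], a[0]
        heapify(end, 0)
    return a
```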
Binary tree traversal Algorithms:
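The three depth-first traversal orders can be sketched recursively (the Node class and function names are illustrative):

```python
class Node:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def inorder(node):
    """Left subtree, root, right subtree."""
    return inorder(node.left) + [node.val] + inorder(node.right) if node else []

def preorder(node):
    """Root, left subtree, right subtree."""
    return [node.val] + preorder(node.left) + preorder(node.right) if node else []

def postorder(node):
    """Left subtree, right subtree, root."""
    return postorder(node.left) + postorder(node.right) + [node.val] if node else []
```

Each traversal visits every node exactly once, so all three run in O(n).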
Linear Search
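Linear search for reference (illustrative): scan left to right, O(n) in the worst case, and the list need not be sorted:

```python
def linear_search(a, key):
    """Return the index of key in a, or -1 if absent."""
    for i, x in enumerate(a):
        if x == key:
            return i
    return -1
```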
Binary Search
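Iterative binary search on a sorted list, sketched for reference (illustrative implementation); each step halves the search range, giving O(log n):

```python
def binary_search(a, key):
    """Return the index of key in sorted list a, or -1 if absent."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == key:
            return mid
        if a[mid] < key:
            lo = mid + 1      # key can only lie in the right half
        else:
            hi = mid - 1      # key can only lie in the left half
    return -1
```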
Recursive binary search algorithm:
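The recursive version, sketched below (illustrative), has recurrence T(n) = T(n/2) + O(1), which solves to O(log n):

```python
def binary_search_rec(a, key, lo=0, hi=None):
    """Recursive binary search on sorted list a; returns an index or -1."""
    if hi is None:
        hi = len(a) - 1
    if lo > hi:                 # empty range: key is absent
        return -1
    mid = (lo + hi) // 2
    if a[mid] == key:
        return mid
    if a[mid] < key:
        return binary_search_rec(a, key, mid + 1, hi)
    return binary_search_rec(a, key, lo, mid - 1)
```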
Asymptotic Notations:
Big Oh (O): f(n) = O(g(n)) if there exist a positive constant c and a positive integer n0 such that f(n) ≤ c·g(n) for all n ≥ n0.
Big Omega (Ω): f(n) = Ω(g(n)) if there exist a positive constant c and a positive integer n0 such that f(n) ≥ c·g(n) for all n ≥ n0.
Big Theta (θ): f(n) = θ(g(n)) if there exist positive constants c1 and c2 and a positive integer n0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0.
Small Oh (o): f(n) = o(g(n)) if for every positive constant c there exists a positive integer n0 such that f(n) < c·g(n) for all n ≥ n0.
Small Omega (ω): f(n) = ω(g(n)) if for every positive constant c there exists a positive integer n0 such that f(n) > c·g(n) for all n ≥ n0.
Worst case time complexities for popular data structures:
Array: access O(1), search O(n), insert O(n), delete O(n)
Stack / Queue: push and pop (enqueue and dequeue) O(1), search O(n)
Singly Linked List: access O(n), search O(n), insert at head O(1), delete at head O(1)
Binary Search Tree: access, search, insert, delete all O(n) (O(log n) when balanced)
Hash Table: search, insert, delete O(n) in the worst case (O(1) on average)
Time complexities for popular sorting algorithms:
Bubble Sort: best O(n), average O(n^2), worst O(n^2)
Selection Sort: best O(n^2), average O(n^2), worst O(n^2)
Insertion Sort: best O(n), average O(n^2), worst O(n^2)
Merge Sort: best O(n log n), average O(n log n), worst O(n log n)
Quick Sort: best O(n log n), average O(n log n), worst O(n^2)
Heap Sort: best O(n log n), average O(n log n), worst O(n log n)
Kruskal’s algorithm:
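Kruskal's algorithm can be sketched as follows (illustrative, using a simple union-find to detect cycles): sort the edges by weight and add each edge that joins two different components.

```python
def kruskal(n, edges):
    """Kruskal's MST. edges: list of (weight, u, v), vertices 0..n-1.
    Returns the total weight of the minimum spanning tree/forest."""
    parent = list(range(n))

    def find(x):                      # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total = 0
    for w, u, v in sorted(edges):     # consider edges in increasing weight
        ru, rv = find(u), find(v)
        if ru != rv:                  # edge joins two components: keep it
            parent[ru] = rv
            total += w
    return total
```

Sorting dominates the running time: O(E log E).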
Prim’s algorithm:
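Prim's algorithm, sketched with a binary heap (the adjacency format is illustrative): grow the tree from a start vertex, always taking the cheapest edge that crosses the cut.

```python
import heapq

def prim(adj, start=0):
    """Prim's MST weight. adj[u] = list of (weight, v) pairs; graph must be connected."""
    visited, heap, total = set(), [(0, start)], 0
    while heap:
        w, u = heapq.heappop(heap)        # cheapest edge reaching an unvisited vertex
        if u in visited:
            continue
        visited.add(u)
        total += w
        for edge in adj[u]:
            if edge[1] not in visited:
                heapq.heappush(heap, edge)
    return total
```

With the heap, the running time is O(E log V).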
Dijkstra’s Algorithm:
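Dijkstra's algorithm for single-source shortest paths with non-negative weights, sketched with a heap (the adjacency format is illustrative):

```python
import heapq

def dijkstra(adj, src):
    """Shortest distances from src. adj[u] = list of (v, weight) pairs."""
    dist = {src: 0}
    heap = [(0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry, skip it
        for v, w in adj.get(u, []):
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w           # relax edge (u, v)
                heapq.heappush(heap, (dist[v], v))
    return dist
```

Running time with a binary heap: O(E log V). Negative edge weights break the algorithm; use Bellman-Ford for those.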
Bellman Ford Algorithm:
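Bellman-Ford handles negative edge weights and detects negative cycles; a minimal sketch (edge-list format is illustrative):

```python
def bellman_ford(n, edges, src):
    """Shortest distances from src over vertices 0..n-1.
    edges: list of (u, v, w). Returns a distance list, or None if a
    negative-weight cycle is reachable."""
    INF = float("inf")
    dist = [INF] * n
    dist[src] = 0
    for _ in range(n - 1):                 # relax every edge n-1 times
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    for u, v, w in edges:                  # one extra pass: any improvement
        if dist[u] + w < dist[v]:          # means a negative cycle exists
            return None
    return dist
```

Running time: O(V·E).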
Floyd-Warshall algorithm:
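Floyd-Warshall computes all-pairs shortest paths with a triple loop over intermediate vertices; a minimal sketch (matrix input format is illustrative):

```python
def floyd_warshall(dist):
    """All-pairs shortest paths. dist: n x n matrix, inf where there is no edge.
    Returns a new matrix; O(n^3) time."""
    n = len(dist)
    d = [row[:] for row in dist]
    for k in range(n):                     # allow vertex k as an intermediate
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d
```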
Asymptotic Notations:
The notations we use to describe the asymptotic (approximate) running time of an algorithm are
defined in terms of functions whose domains are the set of natural numbers N = {0, 1, 2, ...}. The
asymptotic notations consist of the following useful notations.
Big Oh (O):
Note: O(1) refers to constant time; O(n) indicates linear time; O(n^k) (k fixed) refers to polynomial time; O(log n) is called logarithmic time; O(2^n) refers to exponential time; etc.
If we write f(n) = O(g(n)), then there exist a positive constant c and a positive integer n0 such that f(n) ≤ c·g(n) for all n ≥ n0.
OR
f(n) = O(g(n)) means we can say g(n) is an asymptotic upper bound for f(n).
Example-1: Let f(n) = 2n^2 + 4n + 10. Then f(n) = O(n^2), since f(n) ≤ 3n^2 for all n ≥ 6.
Example-2: n = O(n^2), but n^2 ≠ O(n).
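The Big-Oh definition can be checked numerically. A minimal sketch using f(n) = 2n^2 + 4n + 10 with the illustrative witness constants c = 3 and n0 = 6 (one valid choice among many):

```python
# Verify f(n) <= c * g(n) for all n >= n0 over a sample range,
# witnessing that f(n) = O(n^2).
f = lambda n: 2 * n**2 + 4 * n + 10
g = lambda n: n**2
c, n0 = 3, 6

holds = all(f(n) <= c * g(n) for n in range(n0, 1000))
print(holds)  # True
```

At the boundary n = 6: f(6) = 106 ≤ 3·36 = 108, and the gap only grows from there.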
Omega (Ω):
If we write f(n) = Ω(g(n)), then there exist a positive constant c and a positive integer n0 such that f(n) ≥ c·g(n) for all n ≥ n0. Or
f(n) = Ω(g(n)) means we can say Function g(n) is an asymptotic lower bound for f(n).
Example-1: Let f(n) = 2n^2 + 4n + 10. Then f(n) is Ω(n^2).
Theta (θ):
If we write f(n) = θ(g(n)), then there exist positive constants c1 and c2 and a positive integer n0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0. Or
f(n) = θ(g(n)) means we can say Function g(n) is an asymptotically tight bound for f(n).
Example-1: Let f(n) = 2n^2 + 4n + 10. Then f(n) is θ(n^2), with c1 = 1, c2 = 3 and n0 = 6.
Small Oh (o):
If we write f(n) = o(g(n)), then for every positive constant c there exists a positive integer n0 such that f(n) < c·g(n) for all n ≥ n0. Or
f(n) = o(g(n)) means we can say Function g(n) is an upper bound of f(n) that is not asymptotically tight.
Example: n^1.99 = o(n^2)
Small Omega (ω):
If we write f(n) = ω(g(n)), then for every positive constant c there exists a positive integer n0 such that f(n) > c·g(n) for all n ≥ n0. Or
f(n) = ω(g(n)) means we can say g(n) is a lower bound of f(n) that is not asymptotically tight.
Example: n^2.00001 = ω(n^2) and n^2 ≠ ω(n^2)
2. Symmetric Property: f(n) = θ(g(n)) if and only if g(n) = θ(f(n)).
3. Transitive Property: if f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n)); the same holds for Ω, θ, o and ω.
4. Mixed asymptotic transitive Properties: if f(n) = O(g(n)) and g(n) = o(h(n)), then f(n) = o(h(n)).
The complexity of an algorithm is a function describing the efficiency of the algorithm in terms of the amount of data the algorithm must process. Usually there are natural units for the domain and range of this function. There are two main complexity measures of the efficiency of an algorithm:
Algorithms can be classified by the amount of time they need to complete compared to their input size.
The analysis of an algorithm focuses on its complexity, which depends on time or space.
1. Time Complexity:
The time complexity is a function that gives the amount of time required by an algorithm to run to
completion.
Worst case time complexity: It is the function defined by the maximum amount of time needed by an algorithm for an input of size n.
Average case time complexity: The average-case running time of an algorithm is an estimate of
the running time for an "average" input. Computation of average-case running time entails
knowing all possible input sequences, the probability distribution of occurrence of these
sequences, and the running times for the individual sequences.
Amortized Running Time: It is the time required to perform a sequence of (related) operations, averaged over all the operations performed. Amortized analysis guarantees the average performance of each operation in the worst case.
Best case time complexity: It is the minimum amount of time that an algorithm requires for an
input of size n.
2. Space Complexity:
The space complexity is a function that gives the amount of space required by an algorithm to run to
completion.
1. Master Method:
The master method gives us a quick way to find solutions to recurrence relations of the form T(n) = aT(n/b) + f(n), where a and b are constants, a ≥ 1 and b > 1.
T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0
Case-1: If a < b^d, then T(n) ∈ Θ(n^d)
Case-2: If a = b^d, then T(n) ∈ Θ(n^d log n)
Case-3: If a > b^d, then T(n) ∈ Θ(n^(log_b a))
Examples:
T(n) = 4T(n/2) + n ⇒ T(n) ∈ Θ(n^2)
T(n) = 4T(n/2) + n^2 ⇒ T(n) ∈ Θ(n^2 log n)
T(n) = 4T(n/2) + n^3 ⇒ T(n) ∈ Θ(n^3)
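The three-way comparison of a with b^d is mechanical, so it can be sketched in a few lines (a helper written for these notes, not part of any library):

```python
import math

def master_theorem(a, b, d):
    """Classify T(n) = a*T(n/b) + Theta(n^d) by comparing a with b^d.
    Returns a string describing the asymptotic solution."""
    if a < b ** d:
        return f"Theta(n^{d})"              # case 1: work at the root dominates
    if a == b ** d:
        return f"Theta(n^{d} log n)"        # case 2: work is evenly spread
    return f"Theta(n^{math.log(a, b):g})"   # case 3: work at the leaves dominates
```

Running it on the three examples above reproduces Θ(n^2), Θ(n^2 log n), and Θ(n^3) respectively.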
2. Iterative Substitution Method:
The recurrence equation is substituted into itself to find the final generalized form of the recurrence equation.
T(N) = 2T(N/2) + bN
T(N) = 4T(N/4) + 2bN
T(N) = 8T(N/8) + 3bN
...
T(N) = 2^i T(N/2^i) + i·bN
With i = log N (when the subproblem size reaches 1), this gives T(N) = N·T(1) + bN log N = O(N log N).
In the corresponding recursion tree, the given problem with n elements is divided into two equal subproblems of size n/2. At each level of the tree the total number of elements is N, because the tree is split evenly. The maximum depth of the tree is log N (the number of levels).
Recurrence relation: T(N) = 2T(N/2) + O(N) = Θ(N log N).
Example-1: Find the time complexity for T(N) = 4T(N/2) + N.
Let T(N) = aT(N/b) + f(N). Then a = 4, b = 2, and f(N) = N.
N^(log_b a) = N^(log_2 4) = N^2.
f(N) is polynomially smaller than N^(log_b a), i.e. f(N) = O(N^(2-ε)) with ε = 1.
Using case 1 of the master theorem: T(N) = Θ(N^(log_b a)) = Θ(N^2).
Example-2: Find the time complexity for T(N) = 2T(N/2) + N log N.
a = 2, b = 2, and f(N) = N log N, so N^(log_b a) = N and f(N) = Θ(N (log N)^1).
Using case 2 of the (extended) master theorem with k = 1:
T(N) = Θ(N (log N)^2).
Example-3: Find the time complexity for T(N) = T(N/4) + 2N.
a = 1, b = 4, and f(N) = 2N.
log_4 1 = 0, so N^(log_b a) = N^0 = 1.
Using case 3: f(N) is Ω(N^(0+ε)) for ε = 1, and the regularity condition holds: a·f(N/b) = 2(N/4) = N/2 = (1/4)·f(N).
Therefore T(N) = Θ(N).
Example-4: Find the time complexity for T(N) = 9T(N/3) + N^2.5.
a = 9, b = 3, and f(N) = N^2.5.
log_3 9 = 2, so N^(log_b a) = N^2.
Using case 3: f(N) is Ω(N^(2+ε)) for ε = 0.5, and a·f(N/b) = 9(N/3)^2.5 = (1/3)^0.5·f(N).
Therefore T(N) = Θ(N^2.5).