
12/15/2018 Gradeup Play

Weightage Analysis for Algorithms


Day 1 | Algorithms Orientation | Introductions

Algorithms Analysis

GATE Computer Science Engineering 2018
Paper Analysis: 7 Marks
GATE Computer Science Engineering 2017
Paper-1 Analysis: 4 Marks
Paper-2 Analysis: 7 Marks
GATE Computer Science Engineering 2016
Paper-1 Analysis: 6 Marks
Paper-2 Analysis: 9 Marks
GATE Computer Science Engineering 2015
Paper-1 Analysis: 7 Marks
Paper-2 Analysis: 6 Marks
Paper-3 Analysis: 7 Marks
GATE Computer Science Engineering 2014
Paper-1 Analysis: 5 Marks
Paper-2 Analysis: 5 Marks
Paper-3 Analysis: 6 Marks

Important Topics

Topics covered in 2018: Heaps, Matrices, Greedy Algorithms, Graphs
Topics covered in 2017: Searching, Algorithm Matching, Asymptotic Analysis, Greedy Algorithms
Topics covered in 2016: Sorting Algorithms, Complexity of Selection Sort, Merge Sort
Topics covered in 2015: Algorithm Analysis, Heaps, Sorting Algorithms, Graph Algorithms

We hope this analysis will be useful in preparing Algorithms. From it, you can identify the topics most frequently asked in GATE and focus your preparation to crack the exam.


https://courses.gradeup.co/batch/76afc3d0-a53d-11e8-a208-0a99e0cf8f3a/play/9a736e9c-a9f3-11e8-a208-0a99e0cf8f3a 1/1

Important Topics to prepare for Algorithms

The Algorithms subject is very important, as it covers basic concepts used throughout the Computer Science branch. It also carries a good amount of weightage in the GATE exam and in PSU exams, and many interview questions in both the public and private sectors are drawn from it. So we are providing a list of important topics in this subject. While preparing, we advise you not to miss any of these topics, because they are the ones asked most often in GATE and other exams.

Time Complexity 
Properties of Asymptotic Notations
Comparing functions in terms of Big-Oh
Finding time complexity of given code snippet
Divide and Conquer
Finding minimum/maximum (Recurrence Relation and Time Complexity)
Power of an element (Recurrence Relation and Time Complexity)
Binary Search (Recurrence Relation and Time Complexity)
Merge Sort (Recurrence Relation and Time Complexity)
Quick Sort (Recurrence Relation and Time Complexity of Best case and Worst Case)
Heap Sort (Heap Tree, Heapify function, Recurrence Relation and Time Complexity)
Greedy Algorithms
Huffman coding (Finding the average length of the given characters)
Fractional Knapsack (Finding maximum profit)
Job sequencing with a deadline (Finding maximum profit to complete given jobs within a given deadline)
Minimum Cost Spanning Tree (Prim's and Kruskal's Algorithms)
Single Source shortest paths (Dijkstra's Algorithm)
Bellman Ford Algorithm
Dynamic Programming
Longest Common Sub-sequence (Concept with example)
0/1 Knapsack Algorithm (Concept with example)
Sum of subsets problem (Concept with example)
Matrix Chain Multiplication (Concept with example)
Hashing
Collision Resolving Techniques (Open Addressing and Chaining)
Probing (Linear and Quadratic)

So these are the important topics in the Algorithms subject. Preparing them will cover almost the entire subject and help you score maximum marks in it.


https://courses.gradeup.co/batch/76afc3d0-a53d-11e8-a208-0a99e0cf8f3a/play/5456e81c-a9f3-11e8-a208-0a99e0cf8f3a 1/1
Algorithm (Short Notes)

Selection Sort:

Merge Sort:


Quick Sort:

Insertion Sort:

Bubble Sort:

Heap Sort:
Binary tree traversal Algorithms:

Linear Search

Binary Search

Recursive Binary Search Algorithm:

Asymptotic Notations:

Big Oh (O): f(n) = O(g(n)) if there exist a positive constant c and a positive integer n0 such that f(n) ≤ c·g(n) for all n ≥ n0.

Big Omega (Ω): f(n) = Ω(g(n)) if there exist a positive constant c and a positive integer n0 such that f(n) ≥ c·g(n) for all n ≥ n0.

Big Theta (θ): f(n) = θ(g(n)) if there exist positive constants c1 and c2 and a positive integer n0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0.

Small Oh (o): f(n) = o(g(n)) if for every positive constant c there exists a positive integer n0 such that f(n) < c·g(n) for all n ≥ n0.

Small Omega (ω): f(n) = ω(g(n)) if for every positive constant c there exists a positive integer n0 such that f(n) > c·g(n) for all n ≥ n0.

Important Asymptotic relations:

Worst case Time Complexities for popular data structures:

Time Complexities for popular sorting algorithms:

Kruskal’s algorithm:

Prim’s algorithm:

Dijkstra’s Algorithm:

Dijkstra’s Algorithm Implementation using Heap:

Bellman Ford Algorithm:

Floyd-Warshall Algorithm:



Study Notes on Asymptotic Notations


Day 2 | Time Complexity | Asymptotic Notations

Asymptotic Notations:
The notations we use to describe the asymptotic (approximate) running time of an algorithm are defined in terms of functions whose domains are the set of natural numbers N = {0, 1, 2, ...}. The asymptotic notations consist of the following useful notations.
 
Big Oh (O): 
Note: O(1) refers to constant time, O(n) indicates linear time, O(n^k) (k fixed) refers to polynomial time, O(log n) is called logarithmic time, and O(2^n) refers to exponential time.

If we write f(n) = O(g(n)), then there exist a positive constant c and a positive integer n0 such that f(n) ≤ c·g(n) for all n ≥ n0.

          OR

f(n) = O(g(n)) means that g(n) is an asymptotic upper bound for f(n).

Example-1:

Let f(n) = n^2 + n + 5. Then f(n) is O(n^2) and O(n^3), but f(n) is not O(n).

Example-2: 

Let f(n) = 3^n. Then f(n) is O(4^n), but f(n) is not O(2^n).


O(1) < O(log n) < O(n) < O(n^2) < O(2^n)
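The witness constants in the Big-Oh definition can be exhibited concretely; a minimal numeric sketch for Example-1 above (c = 2 and n0 = 6 are choices made here for illustration, not the only valid ones):

```python
def f(n):
    # Example-1 function: f(n) = n^2 + n + 5
    return n * n + n + 5

# Witness constants for f(n) = O(n^2), chosen for this illustration.
c, n0 = 2, 6

# f(n) <= c*n^2 reduces to n + 5 <= n^2, which holds for every n >= 3,
# so any n0 >= 3 works; we spot-check a finite range here.
assert all(f(n) <= c * n * n for n in range(n0, 10_000))
```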

Omega (Ω): 

If we write f(n) = Ω(g(n)), then there exist a positive constant c and a positive integer n0 such that f(n) ≥ c·g(n) for all n ≥ n0. Or
f(n) = Ω(g(n)) means that g(n) is an asymptotic lower bound for f(n).
Example-1: Let f(n) = 2n^2 + 4n + 10. Then f(n) is Ω(n^2).

Theta (θ): 

https://courses.gradeup.co/batch/76afc3d0-a53d-11e8-a208-0a99e0cf8f3a/play/7425d8c4-a95d-11e8-a208-0a99e0cf8f3a 1/7

If we write f(n) = θ(g(n)), then there exist positive constants c1 and c2 and a positive integer n0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0. Or
f(n) = θ(g(n)) means that g(n) is an asymptotically tight bound for f(n).

Example-1:

Let f(n) = 2n^2 + 4n + 10. Then f(n) is θ(n^2).

Small Oh (o):  

If we write f(n) = o(g(n)), then for every positive constant c there exists a positive integer n0 such that f(n) < c·g(n) for all n ≥ n0. Or
f(n) = o(g(n)) means that g(n) is an upper bound of f(n) that is not asymptotically tight.
Example: n^1.99 = o(n^2)

Small Omega (ω): 

If we write f(n) = ω(g(n)), then for every positive constant c there exists a positive integer n0 such that f(n) > c·g(n) for all n ≥ n0. Or
f(n) = ω(g(n)) means that g(n) is a lower bound of f(n) that is not asymptotically tight.
Example: n^2.00001 = ω(n^2), but n^2 ≠ ω(n^2)

II. The relationship between asymptotic notations:


 



III. Properties of Asymptotic notations

1. Reflexive Property:

 
2. Symmetric Property: 

3. Transitive Property:

 
 
4. Mixed asymptotic transitive properties:

The complexity of an algorithm is a function describing the efficiency of the algorithm in terms of the amount of data the algorithm must process. Usually there are natural units for the domain and range of this function. There are two main complexity measures of the efficiency of an algorithm:

IV. Analysis of Algorithms


Algorithms can be classified by the amount of time they need to complete compared to their input size.
The analysis of an algorithm focuses on the complexity of the algorithm, which depends on time or space.

1. Time Complexity: 

The time complexity is a function that gives the amount of time required by an algorithm to run to
completion.

Worst case time complexity: It is the function defined by the maximum amount of time needed by an algorithm for an input of size n.
Average case time complexity: The average-case running time of an algorithm is an estimate of
the running time for an "average" input. Computation of average-case running time entails
knowing all possible input sequences, the probability distribution of occurrence of these
sequences, and the running times for the individual sequences.
Amortized Running Time: It is the time required to perform a sequence of (related) operations, averaged over all the operations performed. Amortized analysis guarantees the average performance of each operation in the worst case.
Best case time complexity: It is the minimum amount of time that an algorithm requires for an
input of size n.  
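The best- and worst-case notions above can be made concrete with linear search; a small sketch (the comparison counter is added here purely to expose the cost of each run):

```python
def linear_search(arr, key):
    """Return (index, comparisons); comparisons counts the basic operation."""
    comparisons = 0
    for i, x in enumerate(arr):
        comparisons += 1
        if x == key:
            return i, comparisons
    return -1, comparisons

data = list(range(10))
# Best case: key at the front -> 1 comparison, i.e. O(1).
assert linear_search(data, 0) == (0, 1)
# Worst case: key absent -> n comparisons, i.e. O(n).
assert linear_search(data, 99) == (-1, 10)
```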

2. Space Complexity: 

The space complexity is a function that gives the amount of space required by an algorithm to run to
completion.
 

V. What are Recurrence Relations?


A recurrence is a function defined in terms of one or more base cases and itself with smaller arguments.
Example:

The above recurrence relation can be solved asymptotically, giving T(n) = O(n^2).


In algorithm analysis, we usually express both the recurrence and its solution using asymptotic
notation.
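As a concrete illustration (the example from the original notes is not reproduced here), consider the recurrence T(n) = T(n-1) + n with T(0) = 0; its closed form n(n+1)/2 is Θ(n^2), matching the O(n^2) bound above:

```python
def T(n):
    # Recurrence: T(n) = T(n-1) + n, base case T(0) = 0.
    return 0 if n == 0 else T(n - 1) + n

# Closed form n(n+1)/2, which is Theta(n^2).
assert all(T(n) == n * (n + 1) // 2 for n in range(100))
```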

Methods to Solve the Recurrence Relation


There are three methods to solve a recurrence relation: the Master Method, the Substitution Method, and the Recursion Tree Method.

1. Master Method: 

The master method gives us a quick way to find solutions to recurrence relations of the form T(n) = aT(n/b) + f(n), where a and b are constants with a ≥ 1 and b > 1.


T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0

Case-1: If a < b^d, then T(n) ∈ Θ(n^d)

Case-2: If a = b^d, then T(n) ∈ Θ(n^d log n)

Case-3: If a > b^d, then T(n) ∈ Θ(n^(log_b a))

Examples:

T(n) = 4T(n/2) + n ⇒ T(n) ∈ Θ(n^2)

T(n) = 4T(n/2) + n^2 ⇒ T(n) ∈ Θ(n^2 log n)


T(n) = 4T(n/2) + n^3 ⇒ T(n) ∈ Θ(n^3)
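The three cases above can be packaged into a small decision procedure; a sketch of the simplified master theorem for f(n) ∈ Θ(n^d) (the string output format is an illustration, not standard notation):

```python
import math

def master_theorem(a, b, d):
    """Solve T(n) = a*T(n/b) + Theta(n^d) via the simplified master theorem.

    Requires a >= 1, b > 1, d >= 0. Returns the asymptotic class as a string.
    """
    if a < b ** d:
        return f"Theta(n^{d})"
    if a == b ** d:
        return f"Theta(n^{d} log n)"
    return f"Theta(n^{math.log(a, b):g})"

# The three examples from the text:
assert master_theorem(4, 2, 1) == "Theta(n^2)"        # T(n) = 4T(n/2) + n
assert master_theorem(4, 2, 2) == "Theta(n^2 log n)"  # T(n) = 4T(n/2) + n^2
assert master_theorem(4, 2, 3) == "Theta(n^3)"        # T(n) = 4T(n/2) + n^3
```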
2. Iterative Substitution Method: 

The recurrence equation is substituted into itself to find the final generalized form of the recurrence equation.

T(N) = 2T(N/2) + bN

Here T(N/2) is substituted with 2T((N/2)/2) + b(N/2):

T(N) = 4T(N/4) + 2bN (similarly substitute for T(N/4))

T(N) = 8T(N/8) + 3bN

After (i-1) substitutions,

T(N) = 2^i T(N/2^i) + i·bN

When i = log N, N/2^i = 1 and we have the base case:

T(N) = 2^(log N) T(N/2^(log N)) + (log N)·bN
T(N) = N·T(1) + (log N)·bN
T(N) = N·b + (log N)·bN

Therefore T(N) is O(N log N).
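The closed form just derived can be cross-checked by evaluating the recurrence directly; a quick numeric sketch for powers of two (b = 3 and the base case T(1) = b are assumptions made for this check):

```python
def T(N, b):
    # T(N) = 2*T(N/2) + b*N with T(1) = b (assumed base case), N a power of two.
    return b if N == 1 else 2 * T(N // 2, b) + b * N

b = 3  # arbitrary constant for the check
for k in range(0, 12):
    N = 2 ** k
    # Derived closed form: T(N) = N*b + b*N*log2(N), where log2(N) = k.
    assert T(N, b) == N * b + b * N * k
```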
3. Recursion Tree Method:

Using the recursion tree method, an n-element problem can be further divided into two or more subproblems. In the following


figure, the given problem with n elements is divided into two equal subproblems of size n/2. Because the problem splits evenly, the sizes of all nodes on each level of the tree sum to N, and the maximum depth of the tree is log N (the number of levels).
The recurrence relation T(n) = 2T(n/2) + O(n) therefore solves to Θ(N log N).

 
Example-1: Find the time complexity for T(N) = 4T(N/2) + N.
Let T(N) = aT(N/b) + f(N).
Then a = 4, b = 2, and f(N) = N.
N^(log_b a) = N^(log_2 4) = N^2.
f(N) is polynomially smaller than N^(log_b a): f(N) = O(N^(2-ε)) with ε = 1.
Using case 1 of the master theorem (in its f(N) vs. N^(log_b a) form),
T(N) = Θ(N^(log_b a)) = Θ(N^2).
 
Example-2: Find the time complexity for T(N) = 2T(N/2) + N log N.
a = 2, b = 2, and f(N) = N log N.
Using case 2 of the extended master theorem with k = 1 (f(N) = Θ(N^(log_b a) log^k N)),
T(N) = Θ(N (log N)^2).
Example-3: Find the time complexity for T(N) = T(N/4) + 2N.
a = 1, b = 4, and f(N) = 2N.
log_b a = log_4 1 = 0, so N^(log_b a) = N^0 = 1.
Using case 3: f(N) is Ω(N^(0+ε)) for ε = 1, and a·f(N/b) = 2(N/4) = N/2 = (1/4)·f(N), so the regularity condition holds.
Therefore T(N) = Θ(f(N)) = Θ(N).
Example-4: Find the time complexity for T(N) = 9T(N/3) + N^2.5.
a = 9, b = 3, and f(N) = N^2.5.
log_b a = log_3 9 = 2, so N^(log_b a) = N^2.
Using case 3, since f(N) is Ω(N^(2+ε)) with ε = 0.5, and a·f(N/b) = 9(N/3)^2.5 = (1/3)^0.5·f(N), the regularity condition holds.
Therefore T(N) = Θ(N^2.5).
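Each example can be sanity-checked by evaluating its recurrence exactly; a sketch for Example-1 (the base case T(1) = 1 is an assumption; with it, T(N) = 2N^2 - N for powers of two, which is Θ(N^2)):

```python
def T(N):
    # Example-1 recurrence: T(N) = 4*T(N/2) + N, with T(1) = 1 assumed.
    return 1 if N == 1 else 4 * T(N // 2) + N

# Exact solution for powers of two: T(N) = 2*N^2 - N, confirming Theta(N^2).
for k in range(0, 15):
    N = 2 ** k
    assert T(N) == 2 * N * N - N
```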


