
THE INDIAN ENGINEERING COLLEGE

Vadakkangulam
DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING
TREC / M.E DEGREE COMPUTER SCIENCE AND ENGINEERING
CS91L DATA STRUCTURES AND ALGORITHMS
Short Questions and Answers (2 Marks)
Prepared by:

ANON J. JENIFER, Lecturer / CSE
UNIT I COMPLEXITY ANALYSIS & ELEMENTARY DATA STRUCTURES
1. What do asymptotic notations mean?
Asymptotic notations are notations introduced to enable us to make
meaningful statements about the time and space complexity of an algorithm. The
different notations are:
Big Oh notation
Omega notation
Theta notation
2. What are the basic asymptotic efficiency classes?
The various basic efficiency classes are:
Constant    : 1
Logarithmic : log n
Linear      : n
n-log-n     : n log n
Quadratic   : n^2
Cubic       : n^3
Exponential : 2^n
Factorial   : n!
3. Define O-notation.
A function t(n) is said to be in O(g(n)), denoted t(n) ∈ O(g(n)), if t(n) is
bounded above by some constant multiple of g(n) for all large n, i.e., if there exist some
positive constant c and some nonnegative integer n0 such that t(n) ≤ c·g(n) for all n ≥ n0.
4. What are exponential growth functions?
The functions 2^n and n! are exponential growth functions, because these two
functions grow so fast that their values become astronomically large even for rather
small values of n.
5. What is worst-case efficiency?
The worst-case efficiency of an algorithm is its efficiency for the worst-case input
of size n, which is an input or inputs of size n for which the algorithm runs the longest
among all possible inputs of that size.

6. What is best-case efficiency?
The best-case efficiency of an algorithm is its efficiency for the best-case input of
size n, which is an input or inputs for which the algorithm runs the fastest among all
possible inputs of that size.
7. What is average-case efficiency?
The average-case efficiency of an algorithm is its efficiency for an average-case
input of size n. It provides information about an algorithm's behavior on a "typical" or
"random" input.
8. What is amortized efficiency?
In some situations a single operation can be expensive, but the total time for the
entire sequence of n such operations is always significantly better than the worst-case
efficiency of that single operation multiplied by n. This is called amortized efficiency.
9. What is the formula used in Euclid's algorithm for finding the greatest common
divisor of two numbers?
Euclid's algorithm is based on repeatedly applying the equality
gcd(m, n) = gcd(n, m mod n) until m mod n is equal to 0, since gcd(m, 0) = m.
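As a brief sketch (no code appears in the original question bank; the function name `gcd` is illustrative), the repeated application of this equality can be written as:

```python
def gcd(m, n):
    """Repeatedly apply gcd(m, n) = gcd(n, m mod n) until the remainder is 0."""
    while n != 0:
        m, n = n, m % n  # when n becomes 0, gcd(m, 0) = m
    return m
```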
10. Define articulation points.
If a graph is not biconnected, the vertices whose removal would disconnect the
graph are known as articulation points.
11. Give some examples of NP-complete problems.
i. Hamiltonian circuit
ii. Travelling salesman problem
iii. Longest path problem
iv. Bin packing
v. Knapsack problem
vi. Graph colouring problem
12. What is the recurrence relation to find out the number of multiplications and the
initial condition for finding the n-th factorial number?
The recurrence relation and initial condition for the number of multiplications is

M(n) = M(n-1) + 1 for n > 0; M(0) = 0
13. What is the basic operation in the Tower of Hanoi problem and give the
recurrence relation for the number of moves?
The moving of disks is considered the basic operation in the Tower of Hanoi
problem, and the recurrence relation for the number of moves is given as M(n) = 2M(n-1) + 1
for n > 1, M(1) = 1.
14. Who introduced the Fibonacci numbers and how can they be defined by a simple
recurrence?
Leonardo Fibonacci introduced the Fibonacci numbers in 1202 as a solution to a
problem about the size of a rabbit population. They can be defined by the simple recurrence
F(n) = F(n-1) + F(n-2) for n > 1, and two initial conditions F(0) = 0 and F(1) = 1.
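The recurrence and its two initial conditions can be sketched iteratively (illustrative code, not part of the original bank):

```python
def fib(n):
    """Compute F(n) from F(n) = F(n-1) + F(n-2), F(0) = 0, F(1) = 1."""
    a, b = 0, 1  # a = F(0), b = F(1)
    for _ in range(n):
        a, b = b, a + b  # slide the window one step up the sequence
    return a
```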
15. What is the explicit formula for the nth Fibonacci number?
The formula for the nth Fibonacci number is given by
F(n) = (φ^n − φ̂^n) / √5
where
φ = (1 + √5) / 2
φ̂ = (1 − √5) / 2
16. Define Data Structures.
Data Structures is defined as the way of organizing all data items that considers not
only the elements stored but also the relationships between the elements.
17. Define Linked Lists.
A linked list consists of a series of structures, which are not necessarily adjacent in
memory. Each structure contains the element and a pointer to the structure containing its
successor. We call this the Next pointer. The last cell's Next pointer points to NULL.
18. State the different types of linked lists.
The different types of linked list include singly linked list, doubly linked list and
circular linked list.
19. List the basic operations carried out in a linked list.
The basic operations carried out in a linked list include:
- Creation of a list
- Insertion of a node
- Deletion of a node
- Modification of a node
- Traversal of the list
20. List out the advantages of using a linked list.
- It is not necessary to specify the number of elements in a linked list during its
declaration
- A linked list can grow and shrink in size depending upon the insertions and deletions
that occur in the list
- Insertions and deletions at any place in a list can be handled easily and efficiently
- A linked list does not waste any memory space
21. List out the applications of a linked list.
Some of the important applications of linked lists are manipulation of
polynomials, sparse matrices, stacks and queues.
22. State the difference between arrays and linked lists.

Arrays                                        Linked Lists
Size of an array is fixed.                    Size of a list is variable.
It is necessary to specify the number         It is not necessary to specify the number
of elements during declaration.               of elements during declaration.
Insertions and deletions are somewhat         Insertions and deletions are carried
difficult.                                    out easily.
It occupies less memory than a linked         It occupies more memory.
list for the same number of elements.
23. What do you mean by balanced trees?
Balanced trees have the structure of binary trees and obey binary search tree
properties. Apart from these properties, they have some special constraints, which differ
from one data structure to another. However, these constraints are aimed only at reducing
the height of the tree, because this factor determines the time complexity.
E.g.: AVL trees, Splay trees.

24. What is the use of a threaded binary tree?
In a threaded binary tree, the NULL pointers are replaced by some addresses. The
left pointer of the node points to its predecessor and the right pointer of the node points to
its successor.

25. Construction of expression trees.
1. Convert the given infix expression into postfix notation.
2. Create a stack and read each character of the expression; if an operand is
encountered, push it onto the stack.
3. When an operator is encountered, pop 2 values from the stack, form a tree using the
operator, and push it back onto the stack.
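The stack-based construction above can be sketched as follows (illustrative code; the `Node` class and function names are assumptions, not part of the original):

```python
class Node:
    """A binary tree node for the expression tree."""
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def build_expression_tree(postfix_tokens):
    stack = []
    for tok in postfix_tokens:
        if tok in "+-*/":
            right = stack.pop()  # popped in reverse order of pushing
            left = stack.pop()
            stack.append(Node(tok, left, right))  # form a tree, push it back
        else:
            stack.append(Node(tok))  # operand becomes a leaf
    return stack.pop()

def inorder(node):
    """Inorder walk recovers the infix form (without parentheses)."""
    if node is None:
        return []
    return inorder(node.left) + [node.value] + inorder(node.right)
```

For example, the postfix expression `ab+c*` yields a tree whose inorder traversal is `a + b * c`.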

26. Why do you need a data structure?
A data structure helps you to understand the relationship of one data element with
the other and organize it within the memory. Sometimes the organization might be simple
and can be very clearly visualized, e.g., a list of names of months in a year - a linear data
structure; a list of historical places in the world - a non-linear data structure. A data
structure helps you to analyze the data, store it and organize it in a logical and
mathematical manner.
UNIT II HEAP STRUCTURES
1. Define Heap.
A heap is a partially ordered data structure, and can be defined as a binary tree with
keys assigned to its nodes, one key per node, provided the following two conditions are met:
i. The tree's shape requirement - the binary tree is essentially complete, that is, all
its levels are full except possibly the last level, where only some rightmost leaves may be
missing.
ii. The parental dominance requirement - the key at each node is greater than or
equal to the keys of its children.
2. What are the properties of a binary heap?
i) Structure Property
ii) Heap Order Property
3. What is a min-heap?
A min-heap is a mirror image of the heap structure. It is a complete binary tree in
which every element is less than or equal to its children. So the root of the min-heap
contains the smallest element.
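As an illustrative aside (not in the original bank), Python's standard `heapq` module maintains exactly this min-heap invariant on a plain list, with the smallest element always at the root:

```python
import heapq

# Build a min-heap by pushing keys one at a time; heapq keeps the list
# in heap order, so heap[0] is always the smallest element.
heap = []
for key in [9, 4, 7, 1, 8]:
    heapq.heappush(heap, key)

smallest = heapq.heappop(heap)  # removes and returns the root
```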
4. What is a max-heap?
If we want the elements in the more typical increasing sorted order, we can
change the ordering property so that the parent has a larger key than the child. This is
called a max-heap.
5. What do you mean by the structure property in a heap?
A heap is a binary tree that is completely filled, with the possible exception of the
bottom level, which is filled from left to right. Such a tree is known as a complete binary
tree.
6. What do you mean by the heap order property?
In a heap, for every node X, the key in the parent of X is smaller than (or
equal to) the key in X, with the exception of the root (which has no parent).
7. What are the applications of priority queues?
- The selection problem
- Event simulation
8. What is the main use of a heap?
Heaps are especially suitable for implementing priority queues. A priority queue is a
set of items with an orderable characteristic called an item's priority, with the following
operations:
i. Finding an item with the highest priority
ii. Deleting an item with the highest priority
iii. Adding a new item to the set

9. Give three properties of heaps.
The properties of a heap are:
- There exists exactly one essentially complete binary tree with n nodes; its height
is equal to ⌊log2 n⌋.
- The root of the heap is always the largest element.
- A node of a heap considered with all its descendants is also a heap.
10. Give the main property of a heap that is implemented as an array.
A heap can be implemented as an array by recording its elements in the top-down,
left-to-right fashion. It is convenient to store the heap's elements in positions 1 through n
of such an array. In such a representation:
i. The parental node keys will be in the first ⌊n/2⌋ positions of the array, while the
leaf keys will occupy the last ⌈n/2⌉ positions.
ii. The children of a key in the array's parental position i (1 ≤ i ≤ ⌊n/2⌋) will be in
positions 2i and 2i+1, and correspondingly, the parent of the key in position i (2 ≤ i ≤ n)
will be in position ⌊i/2⌋.
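The index arithmetic above can be sketched in two one-line helpers (illustrative names, assuming the 1-based array layout just described):

```python
def parent(i):
    """Parent of the key in position i (2 <= i <= n) is at floor(i/2)."""
    return i // 2

def children(i):
    """Children of the parental position i are at 2i and 2i+1."""
    return 2 * i, 2 * i + 1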
11. What are the two alternatives that are used to construct a heap?
The two alternatives to construct a heap are:
Bottom-up heap construction
Top-down heap construction
12. What is the algorithm to delete the root's key from the heap?
ALGORITHM
i. Exchange the root's key with the last key K of the heap.
ii. Decrease the heap's size by one.
iii. "Heapify" the smaller tree by sifting K down the tree exactly in the same way
as in bottom-up heap construction: verify the parental dominance for K; if it holds, stop the
process; if not, swap K with the larger of its children and repeat this operation until the
parental dominance holds for K in its new position.
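A minimal sketch of these three steps for a max-heap stored in `heap[1..n]` (index 0 unused; the function name and layout are assumptions for illustration):

```python
def delete_root(heap, n):
    """Delete the root of a max-heap heap[1..n]; return the new size."""
    heap[1], heap[n] = heap[n], heap[1]  # step i: exchange root with last key K
    n -= 1                               # step ii: decrease the heap's size
    i = 1
    while 2 * i <= n:                    # step iii: sift K down
        child = 2 * i
        if child + 1 <= n and heap[child + 1] > heap[child]:
            child += 1                   # pick the larger of the two children
        if heap[i] >= heap[child]:
            break                        # parental dominance holds: stop
        heap[i], heap[child] = heap[child], heap[i]
        i = child
    return n

heap = [None, 9, 8, 7, 5, 3]  # a max-heap in positions 1..5
n = delete_root(heap, 5)      # root 9 removed; 8 becomes the new root
```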
13. What do you mean by the term "percolate up"?
To insert an element, we have to create a hole in the next available heap location.
Inserting an element in the hole would sometimes violate the heap order property, so we
have to slide down the parent into the hole. This strategy is continued until the correct
location for the new element is found. This general strategy is known as percolate up:
the new element is percolated up the heap until the correct location is found.
14. What do you mean by the term "percolate down"?
When the minimum element is removed, a hole is created at the root. Since the
heap now becomes one smaller, it follows that the last element X in the heap must move
somewhere in the heap. If X can be placed in the hole, then we are done. This is
unlikely, so we slide the smaller of the hole's children into the hole, thus pushing the hole
down one level. We repeat this step until X can be placed in the hole. Thus, our action is
to place X in its correct spot along a path from the root containing minimum children.
This general strategy is known as percolate down.
15. What is meant by implicit heaps?
A particularly simple and beautiful implementation of the heap structure is the
implicit heap. Data is simply put into an array, and the children of
element x[i] are defined to be the elements x[2i] and x[2i+1]. Thus the parent of the
element in position c can be found at position c/2 (integer division).
16. What is a skew heap?
A skew heap is a heap data structure that is stored in a binary tree (not necessarily
complete and balanced). The insertion and deletion of elements in the tree come from
merging two skew heaps together in such a way that the heap property is preserved and
the tree does not degrade to a linear tree.
17. How are skew heaps merged?
Suppose we are merging a heap containing the elements 2, 5, and 7 with a heap
containing the elements 4 and 6.

      7   <- merge ->   6
     / \               /
    5   2             4

i. Identify the root: 7 becomes the new root, and the right subtree of the heap
with root 7 is merged with the other heap:

    7     2  <- merge ->  6    (to form the left subtree
     \                   /      of the new skew heap)
      5                 4

ii.
        7
       / \
      6   5
       \
        4  <- merge ->  2

iii.
        7
       / \
      6   5
     / \
    2   4
18. Define Fibonacci Heap.
The Fibonacci heap is a data structure that supports all the basic heap operations
in O(1) amortized time, with the exception of delete_min and delete, which take O(log n)
amortized time. It immediately follows that the heap operations in Dijkstra's algorithm
will require a total of O(|E| + |V| log |V|) time.
19. What is lazy merging?
Two heaps are merged only when it is required to do so. This is similar to lazy
deletion. For lazy merging, merges are cheap, but because lazy merging does not actually
combine trees, the delete_min operation could encounter lots of trees, making that
operation expensive. Any one delete_min could take linear time, but it is always possible
to charge the time to previous merge operations. In particular, an expensive delete_min
must have been preceded by a large number of unduly cheap merges, which have been
able to store up extra potential.
20. Write the amortized analysis of Lazy Binomial Queues.
To carry out the amortized analysis of lazy binomial queues, we will use the same
potential function that was used for standard binomial queues. Thus, the potential of a
lazy binomial queue is the number of trees.
21. Binomial heaps.
A binomial heap H is a set of binomial trees satisfying the following:
1. Each binomial tree in H is heap-ordered:
the key of a node is greater than or equal to the key of its parent.
2. There is at most one binomial tree in H whose root has a given degree.
22. Write Dijkstra's shortest path algorithm.
Let G = (V, E) be a weighted (weights are non-negative) undirected graph, and let
s ∈ V. We want to find the distance (length of the shortest path) d(s, v) from s to every
other vertex.
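A minimal sketch of the algorithm with a heap-based priority queue (illustrative only; the adjacency-dict representation and names are assumptions, and each undirected edge would be listed in both directions):

```python
import heapq

def dijkstra(graph, s):
    """graph[u] is a list of (v, weight) pairs with non-negative weights."""
    dist = {s: 0}
    pq = [(0, s)]                # (tentative distance, vertex), smallest first
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue             # stale queue entry; a shorter path was found
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w  # relax the edge (u, v)
                heapq.heappush(pq, (d + w, v))
    return dist

graph = {"s": [("a", 2), ("b", 5)], "a": [("b", 1)], "b": []}
distances = dijkstra(graph, "s")
```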
UNIT III SEARCH STRUCTURES
1. What is binary search?
Binary search is a remarkably efficient algorithm for searching in a sorted array. It
works by comparing a search key K with the array's middle element A[m]. If they match,
the algorithm stops; otherwise the same operation is repeated recursively for the first half
of the array if K < A[m] and for the second half if K > A[m].

                          K
A[0] ... A[m-1]         A[m]         A[m+1] ... A[n-1]
(search here if K < A[m])    (search here if K > A[m])
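The comparison scheme above can be sketched iteratively (illustrative code, not part of the original bank):

```python
def binary_search(A, K):
    """Search for key K in sorted list A; return its index, or -1."""
    lo, hi = 0, len(A) - 1
    while lo <= hi:
        m = (lo + hi) // 2
        if K == A[m]:
            return m       # match: the algorithm stops
        elif K < A[m]:
            hi = m - 1     # continue in A[0..m-1]
        else:
            lo = m + 1     # continue in A[m+1..n-1]
    return -1              # key not present

A = [3, 14, 27, 31, 39, 42, 55, 70, 74]
position = binary_search(A, 70)
```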
2. What is a binary tree extension and what is its use?
The binary tree extension can be drawn by replacing the empty subtrees by
special nodes in a binary tree. The extra nodes, shown as little squares, are called external,
and the original nodes, shown as little circles, are called internal. The extension of an empty
binary tree is a single external node. The binary tree extension helps in the analysis of tree
algorithms.
3. What are the classic traversals of a binary tree?
The classic traversals are as follows:
i. Preorder traversal: the root is visited before the left and right subtrees.
ii. Inorder traversal: the root is visited after visiting the left subtree and before visiting
the right subtree.
iii. Postorder traversal: the root is visited after visiting the left and right subtrees.
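The three orders can be sketched side by side (illustrative code; nodes are represented here as `(value, left, right)` tuples, an assumption for compactness):

```python
def preorder(t):
    if t is None:
        return []
    v, left, right = t
    return [v] + preorder(left) + preorder(right)    # root first

def inorder(t):
    if t is None:
        return []
    v, left, right = t
    return inorder(left) + [v] + inorder(right)      # root in the middle

def postorder(t):
    if t is None:
        return []
    v, left, right = t
    return postorder(left) + postorder(right) + [v]  # root last
```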
4. Mention an algorithm to find out the height of a binary tree.
ALGORITHM Height(T)
//Computes recursively the height of a binary tree
//Input: A binary tree T
//Output: The height of T
if T = ∅
    return -1
else
    return max{Height(TL), Height(TR)} + 1
5. What are binary search trees and what are they mainly used for?
Binary search trees are one of the principal data structures for implementing
dictionaries. A binary search tree is a binary tree whose nodes contain elements of a set of
orderable items, one element per node, so that all elements in the left subtree are smaller
than the element in the subtree's root and all elements in the right subtree are greater than it.
6. Define AVL trees and who were they invented by?
An AVL tree is a binary search tree in which the balance factor of every node,
which is defined as the difference between the heights of the node's left and right
subtrees, is either 0, +1 or -1; the height of an empty subtree is defined as -1. AVL
trees were invented in 1962 by two Russian scientists, G. M. Adelson-Velsky and
E. M. Landis, after whom the data structure is named.
7. Define AVL Tree.
An empty tree is height balanced. If T is a non-empty binary tree with TL and
TR as its left and right subtrees, then T is height balanced if
i) TL and TR are height balanced, and
ii) |hL - hR| ≤ 1,
where hL and hR are the heights of TL and TR respectively.
8. What are the various transformations performed in an AVL tree?
1. Single rotation - single L rotation, single R rotation
2. Double rotation - LR rotation, RL rotation
9. What are the categories of AVL rotations?
Let A be the nearest ancestor of the newly inserted node which has a balance
factor of ±2. Then the rotations can be classified into the following four categories:
Left-Left: the newly inserted node is in the left subtree of the left child of A.
Right-Right: the newly inserted node is in the right subtree of the right child of A.
Left-Right: the newly inserted node is in the right subtree of the left child of A.
Right-Left: the newly inserted node is in the left subtree of the right child of A.
10. What do you mean by the balance factor of a node in an AVL tree?
The height of the left subtree minus the height of the right subtree is called the
balance factor of a node in an AVL tree. The balance factor may be either 0, +1 or -1.
The height of an empty tree is -1.
11. Write about the efficiency of AVL trees.
As with any search tree, the critical characteristic is the tree's height. The tree's
height is bounded above and below by logarithmic functions. The height h of any AVL
tree with n nodes satisfies the inequalities
⌊log2 n⌋ ≤ h < 1.4405 log2(n + 2) − 1.3277
The inequalities imply that the operations of searching and insertion are Θ(log n) in
the worst case. The operation of key deletion in an AVL tree is more difficult than
insertion, but it turns out to have the same efficiency class as insertion, i.e., logarithmic.
12. What is the minimum number of nodes in an AVL tree of height h?
The minimum number of nodes S(h) in an AVL tree of height h is given
by S(h) = S(h-1) + S(h-2) + 1. For h = 0, S(h) = 1.
13. What are 2-3 trees and who invented them?
A 2-3 tree is a tree that can have nodes of two kinds: 2-nodes and 3-nodes. A
2-node contains a single key K and has two children: the left child serves as the root of a
subtree whose keys are less than K, and the right child serves as the root of a subtree with
keys greater than K. A 3-node contains two ordered keys K1 and K2 (K1 < K2). The
leftmost child serves as the root of a subtree with keys less than K1, the middle child
serves as the root of a subtree with keys between K1 and K2, and the rightmost child serves
as the root of a subtree with keys greater than K2. The last requirement of 2-3 trees is that
all its leaves must be on the same level, so a 2-3 tree is always height balanced. 2-3 trees
were introduced by John Hopcroft in 1970.
14. What do you mean by a 2-3-4 tree?
A B-tree of order 4 is called a 2-3-4 tree. A B-tree of order 4 is a tree that is not
binary, with the following structural properties:
- The root is either a leaf or has between 2 and 4 children.
- All non-leaf nodes (except the root) have between 2 and 4 children.
- All leaves are at the same depth.
15. Define B-tree of order M.
A B-tree of order M is a tree that is not binary, with the following structural
properties:
- The root is either a leaf or has between 2 and M children.
- All non-leaf nodes (except the root) have between ⌈M/2⌉ and M children.
- All leaves are at the same depth.
16. What are the applications of B-trees?
- Database implementation
- Indexing on non-primary key fields
17. Definition of a red-black tree.
A red-black tree is a binary search tree which has the following red-black properties:
1. Every node is either red or black.
2. Every leaf (NULL) is black.
3. If a node is red, then both its children are black.
4. Every simple path from a node to a descendant leaf contains the same number of
black nodes.
(Figures: a basic red-black tree, and the same tree with the sentinel nodes added.)
Implementations of the red-black tree algorithms will usually include the sentinel
nodes as a convenient means of flagging that you have reached a leaf node. They are
the NULL black nodes of property 2.
18. Define splay tree.
A splay tree is a binary search tree in which restructuring is done using a scheme
called splay. The splay is a heuristic method which moves a given vertex v to the root of
the splay tree using a sequence of rotations.
19. What is the idea behind splaying?
Splaying reduces the total accessing time if the most frequently accessed node is
moved towards the root. It does not require maintaining any information regarding the
height or balance factor, and hence saves space and simplifies the code to some extent.
20. List the types of rotations available in a splay tree.
Let us assume that the splay is performed at vertex v, whose parent and
grandparent are p and g respectively. Then the three rotations are named as:
i. Zig: if p is the root and v is the left child of p, then a left-left rotation at p would
suffice. This case always terminates the splay, as v reaches the root after this
rotation.
ii. Zig-Zig: if p is not the root, p is a left child and v is also a left child, then a left-
left rotation at g followed by a left-left rotation at p brings v as an ancestor of g
as well as p.
iii. Zig-Zag: if p is not the root, p is a left child and v is a right child, perform a
left-right rotation at g and bring v as an ancestor of p as well as g.
21. Define brute force string matching.
In brute force string matching, we are given a string of n characters called the text
and a string of m characters called the pattern; the problem is to find a substring of the
text that matches the pattern, i.e., to find the index i of the leftmost character of the first
matching substring in the text.
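The definition translates directly into a sketch that slides the pattern over the text and compares character by character (illustrative code; the function name is assumed):

```python
def brute_force_match(text, pattern):
    """Return the index of the leftmost match of pattern in text, or -1."""
    n, m = len(text), len(pattern)
    for i in range(n - m + 1):        # each possible alignment of the pattern
        j = 0
        while j < m and text[i + j] == pattern[j]:
            j += 1                    # characters match so far
        if j == m:
            return i                  # the whole pattern matched at index i
    return -1                         # no matching substring
```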
22. What are the advantages of the brute force technique?
The various advantages of the brute force technique are:
i. Brute force is applicable to a very wide variety of problems. It is used for many
elementary but important algorithmic tasks.
ii. For some important problems this approach yields reasonable algorithms of at
least some practical value with no limitation on instance size.
iii. The expense of designing a more efficient algorithm may be unjustifiable if only a
few instances of a problem need to be solved and a brute force algorithm can solve
those instances with acceptable speed.
iv. Even if inefficient in general, it can still be used for solving small-size
instances of a problem.
v. It can serve as a yardstick with which to judge more efficient alternatives for
solving a problem.
23. What are the properties of a binary heap?
i) Structure Property
ii) Heap Order Property
UNIT IV GREEDY & DIVIDE AND CONQUER
1. Give the general plan for divide-and-conquer algorithms.
The general plan is as follows:
i. A problem's instance is divided into several smaller instances of the same problem,
ideally of about the same size.
ii. The smaller instances are solved, typically recursively.
iii. If necessary, the solutions obtained are combined to get the solution of the original
problem.
2. State the Master theorem and its use.
If f(n) ∈ Θ(n^d) where d ≥ 0 in the recurrence equation T(n) = aT(n/b) + f(n), then

         Θ(n^d)          if a < b^d
T(n) ∈   Θ(n^d log n)    if a = b^d
         Θ(n^(log_b a))  if a > b^d

The efficiency analysis of many divide-and-conquer algorithms is greatly simplified by
the use of the Master theorem. For example, for mergesort a = 2, b = 2 and d = 1, so
a = b^d and T(n) ∈ Θ(n log n).
3. What is the general divide-and-conquer recurrence relation?
An instance of size n can be divided into several instances of size n/b, with a
of them needing to be solved. Assuming that size n is a power of b, to simplify the
analysis, the following recurrence for the running time is obtained: T(n) = aT(n/b) + f(n),
where f(n) is a function that accounts for the time spent on dividing the problem into
smaller ones and on combining their solutions.
4. What is the decrease-and-conquer approach and mention its variations.
The decrease-and-conquer technique is based on exploiting the relationship between
a solution to a given instance of a problem and a solution to a smaller instance of the
same problem. The three major variations are:
Decrease by a constant
Decrease by a constant factor
Variable size decrease
5. What is a tree edge and back edge?
In the depth-first search forest, whenever a new unvisited vertex is reached for
the first time, it is attached as a child to the vertex from which it is being reached. Such
an edge is called a tree edge, because the set of all such edges forms a forest. The algorithm
may also encounter an edge leading to a previously visited vertex other than its immediate
predecessor. Such an edge is called a back edge, because it connects a vertex to its
ancestor, other than the parent, in the depth-first search forest.
6. What is a tree edge and cross edge?
In the breadth-first search forest, whenever a new unvisited vertex is reached for
the first time, it is attached as a child to the vertex from which it is being reached. Such
an edge is called a tree edge. If an edge leads to a previously visited vertex other than
its immediate predecessor, that edge is noted as a cross edge.
7. What is the transform-and-conquer technique?
The group of design techniques that are based on the idea of transformation is
called the transform-and-conquer technique, because the methods work as two-stage
procedures. First, in the transformation stage, the
problem's instance is modified to be more amenable (agreeable) to the solution.
Then, in the second or conquering stage, it is solved.
8. What is the greedy technique?
The greedy technique suggests a greedy grab of the best alternative available in the
hope that a sequence of locally optimal choices will yield a globally optimal solution to
the entire problem. The choice must be made as follows:
i. Feasible: it has to satisfy the problem's constraints.
ii. Locally optimal: it has to be the best local choice among all feasible choices
available on that step.
iii. Irrevocable: once made, it cannot be changed on a subsequent step of the
algorithm.
9. What is a state-space tree?
The processing of backtracking is implemented by constructing a tree of choices
being made. This is called the state-space tree. Its root represents an initial state before the
search for a solution begins. The nodes of the first level in the tree represent the choices
made for the first component of the solution, the nodes of the second level represent the
choices for the second component, and so on.
10. What is a promising node in the state-space tree?
A node in a state-space tree is said to be promising if it corresponds to a partially
constructed solution that may still lead to a complete solution.
11. What is a non-promising node in the state-space tree?
A node in a state-space tree is said to be promising if it corresponds to a partially
constructed solution that may still lead to a complete solution; otherwise it is called non-
promising.
12. What do leaves in the state-space tree represent?
Leaves in the state-space tree represent either non-promising dead ends or
complete solutions found by the algorithm.
13. What is the manner in which the state-space tree for a backtracking algorithm is
constructed?
In the majority of cases, a state-space tree for a backtracking algorithm is
constructed in the manner of depth-first search. If the current node is promising, its child
is generated by adding the first remaining legitimate option for the next component of a
solution, and the processing moves to this child. If the current node turns out to be non-
promising, the algorithm backtracks to the node's parent to consider the next possible
option. When a complete solution is reached, the algorithm either stops or backtracks to
continue searching for other possible solutions.
14. What is a feasible solution and what is an optimal solution?
In optimization problems, a feasible solution is a point in the problem's search
space that satisfies all the problem's constraints, while an optimal solution is a feasible
solution with the best value of the objective function.
15. Define the Divide and Conquer algorithm.
A divide-and-conquer algorithm is based on dividing the problem to be solved into
several smaller sub-instances, solving them independently and then combining the sub-
instances' solutions so as to yield a solution for the original instance.
16. Mention some applications of the Divide and Conquer algorithm.
a. Quick Sort  b. Merge Sort  c. Binary Search
17. What are the two stages of heap sort?
Stage 1: Construction of the heap. Stage 2: Root deletion n-1 times.
18. What is the divide and conquer strategy?
In the divide and conquer strategy, the given problem is divided into smaller
problems and solved recursively. The conquering phase consists of patching together the
answers. Divide and conquer is a very powerful use of recursion that we will see
many times.
19. What do you mean by separate chaining?
Separate chaining is a collision resolution technique which keeps a list of all elements
that hash to the same value. It is called separate chaining because each hash table
element is a separate chain (linked list). Each linked list contains all the elements whose
keys hash to the same index.
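A minimal sketch of the idea (illustrative only; the class name is an assumption, and Python lists stand in for the linked-list chains):

```python
class ChainedHashTable:
    """Each table slot holds a chain of [key, value] pairs that hash there."""
    def __init__(self, size=10):
        self.table = [[] for _ in range(size)]

    def _slot(self, key):
        return hash(key) % len(self.table)

    def insert(self, key, value):
        chain = self.table[self._slot(key)]
        for pair in chain:
            if pair[0] == key:
                pair[1] = value      # key already present: update in place
                return
        chain.append([key, value])   # otherwise append to the chain

    def lookup(self, key):
        for k, v in self.table[self._slot(key)]:
            if k == key:
                return v
        return None                  # key not found
```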
20. Write the advantages and disadvantages of separate chaining.
Advantages:
- More elements can be inserted, as it uses linked lists.
Disadvantages:
- The elements are not evenly distributed: some slots may have many
elements and some may not have anything.
- It requires pointers. This slows the algorithm down a bit because of
the time required to allocate new cells, and it also essentially requires the
implementation of a second data structure.
21. What do you mean by open addressing?
Open addressing is a collision resolving strategy in which, if a collision occurs,
alternative cells are tried until an empty cell is found. The cells h0(x), h1(x), h2(x), ... are
tried in succession, where hi(x) = (Hash(x) + F(i)) mod TableSize with F(0) = 0. The function
F is the collision resolution strategy.
22. What do you mean by probing?
Probing is the process of getting the next available hash table array cell.
23. What do you mean by linear probing?
Linear probing is an open addressing collision resolution strategy in which F is a
linear function of i, F(i) = i. This amounts to trying cells sequentially in search of an empty
cell. If the table is big enough, a free cell can always be found, but the time to do so can get
quite large.
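A sketch of linear-probing insertion with F(i) = i (illustrative code; `key % size` stands in for a real hash function, an assumption for integer keys):

```python
def probe_insert(table, key):
    """Insert key into the open-addressed table; return the cell used."""
    size = len(table)
    h = key % size               # a simple hash, assumed for illustration
    for i in range(size):
        cell = (h + i) % size    # h_i(x) = (Hash(x) + F(i)) mod TableSize
        if table[cell] is None:
            table[cell] = key    # first empty cell found: store the key
            return cell
    raise RuntimeError("table is full")
```

For example, in a table of size 7, the keys 9 and 16 both hash to cell 2; the second one probes linearly and lands in cell 3.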
24. What do you mean by primary clustering?
In the linear probing collision resolution strategy, even if the table is relatively
empty, blocks of occupied cells start forming. This effect, known as primary
clustering, means that any key that hashes into the cluster will require several attempts
to resolve the collision, and then it will add to the cluster.
25. What do you mean by quadratic probing?
Quadratic probing is an open addressing collision resolution strategy in which F(i) = i^2.
There is no guarantee of finding an empty cell once the table gets half full if the table size
is not prime. This is because at most half of the table can be used as alternative locations
to resolve collisions.
26. What do you mean by secondary clustering?
Although quadratic probing eliminates primary clustering, elements that hash to the
same position will probe the same alternative cells. This is known as secondary
clustering.
27. List the limitations of linear probing.
- The time taken for finding the next available cell can be large.
- In linear probing, we come across a problem known as clustering.
28. Mention one advantage and disadvantage of using quadratic probing.
Advantage: The problem of primary clustering is eliminated.
Disadvantage: There is no guarantee of finding an unoccupied cell once the table is nearly half full.
UNIT V DYNAMIC PROGRAMMING AND BACKTRACKING
1. Define Graph.
A graph G consists of a nonempty set V, which is the set of nodes of the graph; a set E, which is the set of edges of the graph; and a mapping from the edge set E to pairs of elements of V. It can also be represented as G = (V, E).
2. What is meant by strongly and weakly connected in a graph?
An undirected graph is connected if there is a path from every vertex to every other vertex. A directed graph with this property is called strongly connected. When a directed graph is not strongly connected but the underlying undirected graph is connected, the graph is said to be weakly connected.
3. List the two important key points of depth-first search.
i) If a path exists from one node to another node, walk across the edge, exploring it.
ii) If no path exists from one specific node to any other node, return to the previously visited node, i.e., backtrack.
4. What do you mean by breadth-first search (BFS)?
BFS performs simultaneous explorations starting from a common point and spreading out independently.
?2 D);;'$'n%)a%' .FS and DFS2
"o. A(B B(B
. Backtracking is possible from a
dead end
Backtracking is not possible
%. Fertices from 0hich exploration is
incomplete are processed in a
!2(O order
The vertices to be explored are
organi1ed as a
(2(O 4ueue
&. Bearch is done in one particular
Airection
The vertices in the same level are
maintained
parallely
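The contrast in the table (a LIFO stack for DFS, a FIFO queue for BFS) can be seen directly in code. A minimal sketch on an adjacency-list graph; the sample graph is an assumption made for this example.

```python
from collections import deque

def dfs(graph, start):
    """Depth-first traversal: vertices awaiting exploration sit on a LIFO stack."""
    stack, seen, order = [start], {start}, []
    while stack:
        v = stack.pop()                  # LIFO: most recently discovered first
        order.append(v)
        for w in reversed(graph[v]):     # reversed so neighbours pop in list order
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return order

def bfs(graph, start):
    """Breadth-first traversal: vertices of the same level wait in a FIFO queue."""
    queue, seen, order = deque([start]), {start}, []
    while queue:
        v = queue.popleft()              # FIFO: earliest discovered first
        order.append(v)
        for w in graph[v]:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return order

g = {'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': []}
# dfs(g, 'A') -> ['A', 'B', 'D', 'C'];  bfs(g, 'A') -> ['A', 'B', 'C', 'D']
```

DFS drives one direction to its end before backtracking (A, B, D, then C), while BFS finishes the whole level B, C before descending to D.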
6. What do you mean by articulation point?
If a graph is not biconnected, the vertices whose removal would disconnect the graph are known as articulation points.
7. What is a state space tree?
The processing of backtracking is implemented by constructing a tree of the choices being made, called the state-space tree. Its root represents an initial state before the search for a solution begins. The nodes of the first level in the tree represent the choices made for the first component of the solution, the nodes of the second level represent the choices for the second component, and so on.
8. What is a promising node in the state-space tree?
A node in a state-space tree is said to be promising if it corresponds to a partially constructed solution that may still lead to a complete solution.
9. What is a non-promising node in the state-space tree?
A node in a state-space tree is said to be promising if it corresponds to a partially constructed solution that may still lead to a complete solution; otherwise it is called non-promising.
10. What do leaves in the state space tree represent?
Leaves in the state-space tree represent either non-promising dead ends or complete solutions found by the algorithm.
11. What is dynamic programming and who discovered it?
Dynamic programming is a technique for solving problems with overlapping subproblems. These subproblems arise from a recurrence relating a solution to a given problem with solutions to its smaller subproblems. Dynamic programming solves each subproblem only once and records the result in a table, from which the solution to the original problem is obtained. It was invented by a prominent U.S. mathematician, Richard Bellman, in the 1950s.
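The idea of recording each subproblem's result in a table can be shown with the classic Fibonacci recurrence; the function name and the dictionary used as the table are choices made for this sketch.

```python
def fib(n, table={0: 0, 1: 1}):
    """Fibonacci by dynamic programming: the recurrence F(n) = F(n-1) + F(n-2)
    has overlapping subproblems, so each result is recorded in a table
    (the mutable default dict, kept deliberately as a persistent memo)
    and every subproblem is solved only once."""
    if n not in table:
        table[n] = fib(n - 1) + fib(n - 2)
    return table[n]
```

fib(30) returns 832040 after only about 30 new subproblem evaluations, whereas the plain recursion would repeat work exponentially.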
12. What is backtracking?
Backtracking constructs solutions one component at a time, and such partially constructed solutions are evaluated as follows:
i. If a partially constructed solution can be developed further without violating the problem's constraints, it is done by taking the first remaining legitimate option for the next component.
ii. If there is no legitimate option for the next component, no alternatives for any remaining component need to be considered. In this case, the algorithm backtracks to replace the last component of the partially constructed solution with its next option.
13. In what manner is the state-space tree for a backtracking algorithm constructed?
In the majority of cases, a state-space tree for a backtracking algorithm is constructed in the manner of depth-first search. If the current node is promising, its child is generated by adding the first remaining legitimate option for the next component of a solution, and the processing moves to this child. If the current node turns out to be non-promising, the algorithm backtracks to the node's parent to consider the next possible option for its last component. If the algorithm reaches a complete solution to the problem, it either stops or backtracks to continue searching for other possible solutions.
14. How can the output of a backtracking algorithm be thought of?
The output of a backtracking algorithm can be thought of as an n-tuple (x1, ..., xn) where each coordinate xi is an element of some finite linearly ordered set Si. If such a tuple (x1, ..., xi) is not a solution, the algorithm finds the next element in S(i+1) that is consistent with the values of (x1, ..., xi) and the problem's constraints, and adds it to the tuple as its (i+1)st coordinate. If such an element does not exist, the algorithm backtracks to consider the next value of xi, and so on.
15. Give a template for a generic backtracking algorithm.
ALGORITHM Backtrack(X[1..i])
//Gives a template of a generic backtracking algorithm
//Input: X[1..i] specifies the first i promising components of a solution
//Output: All the tuples representing the problem's solutions
if X[1..i] is a solution
    write X[1..i]
else
    for each element x in S(i+1) consistent with X[1..i] and the constraints do
        X[i+1] <- x
        Backtrack(X[1..i+1])
16. Write a recursive algorithm for solving the Tower of Hanoi problem.
ALGORITHM
To move n > 1 disks from peg1 to peg3, with peg2 as auxiliary:
- First, move recursively n-1 disks from peg1 to peg2, with peg3 as auxiliary.
- Then move the largest disk directly from peg1 to peg3.
- Finally, move recursively n-1 disks from peg2 to peg3, with peg1 as auxiliary.
If n = 1, simply move the single disk from the source peg to the destination peg.
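The three steps above translate almost line by line into code. A minimal sketch; the peg names and the recorded move list are choices of this example.

```python
def hanoi(n, source, aux, dest, moves):
    """Recursively move n disks from source to dest using aux, recording each move."""
    if n == 1:
        moves.append((source, dest))        # base case: move the single disk directly
        return
    hanoi(n - 1, source, dest, aux, moves)  # move n-1 disks out of the way
    moves.append((source, dest))            # largest disk straight to destination
    hanoi(n - 1, aux, source, dest, moves)  # bring the n-1 disks back on top

moves = []
hanoi(3, 'peg1', 'peg2', 'peg3', moves)
# For 3 disks the list holds 7 moves, beginning and ending with peg1 -> peg3.
```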
17. What is the basic operation in the Tower of Hanoi problem, and give the recurrence relation for the number of moves.
The moving of disks is considered the basic operation in the Tower of Hanoi problem, and the recurrence relation for the number of moves is M(n) = 2M(n-1) + 1 for n > 1, with M(1) = 1.
18. What is the n-queens problem?
The problem is to place n queens on an n-by-n chessboard so that no two queens attack each other by being in the same row, in the same column, or on the same diagonal.
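A compact backtracking sketch for the n-queens problem, representing a solution as a tuple whose entry i is the column of the queen in row i; the function names are choices of this example.

```python
def solve_queens(n):
    """Backtracking for n-queens: board[i] is the column of the queen in row i."""
    solutions = []

    def safe(board, col):
        # the new queen must share no column and no diagonal with earlier rows
        row = len(board)
        return all(c != col and abs(c - col) != row - r
                   for r, c in enumerate(board))

    def place(board):
        if len(board) == n:               # all n components chosen: a solution
            solutions.append(tuple(board))
            return
        for col in range(n):              # try each legitimate option in turn
            if safe(board, col):
                place(board + [col])      # extend; returning here is the backtrack

    place([])
    return solutions

# solve_queens(4) -> [(1, 3, 0, 2), (2, 0, 3, 1)]
```

This follows the generic Backtrack template: each recursive call extends the tuple by one consistent coordinate, and exhausting the column options for a row backtracks to the previous row.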
19. Define the Hamiltonian circuit.
The Hamiltonian circuit is defined as a cycle that passes through all the vertices of the graph exactly once. It is named after the Irish mathematician Sir William Rowan Hamilton (1805-1865). It is a sequence of n+1 adjacent vertices v_i0, v_i1, ..., v_i(n-1), v_i0, where the first vertex of the sequence is the same as the last one, while all the other n-1 vertices are distinct.
20. What is the method used to find the solution to the n-queens problem by symmetry?
The board of the n-queens problem has several symmetries, so some solutions can be obtained from others by reflection. Placements in the last n/2 columns need not be considered, because any solution with the first queen in square (1, i), n/2 < i <= n, can be obtained by reflection from a solution with the first queen in square (1, n-i+1).
21. What are the additional features required in branch-and-bound when compared to backtracking?
Compared to backtracking, branch-and-bound requires:
i. A way to provide, for every node of a state-space tree, a bound on the best value of the objective function on any solution that can be obtained by adding further components to the partial solution represented by the node.
ii. The value of the best solution seen so far.
22. What is the knapsack problem?
Given n items of known weights wi and values vi, i = 1, 2, ..., n, and a knapsack of capacity W, find the most valuable subset of the items that fits into the knapsack. It is convenient to order the items of a given instance in descending order by their value-to-weight ratios. Then the first item gives the best payoff per weight unit and the last one gives the worst.
23. Give the formula used to find the upper bound for the knapsack problem.
A simple way to find the upper bound ub is to add to v, the total value of the items already selected, the product of the remaining capacity of the knapsack (W - w) and the best payoff per weight unit among the remaining items, which is v(i+1)/w(i+1):
ub = v + (W - w)(v(i+1)/w(i+1))
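The formula can be checked with a small numeric example; the instance below (three items already sorted by value-to-weight ratio, capacity W = 10) is an assumption made for illustration.

```python
def upper_bound(v, w, W, next_value, next_weight):
    """ub = v + (W - w) * (v_{i+1} / w_{i+1}): the value already collected plus
    the remaining capacity filled at the best remaining payoff per weight unit."""
    return v + (W - w) * (next_value / next_weight)

# Items sorted by ratio: (value 40, weight 4), (30, 6), (50, 10); capacity W = 10.
# At the root nothing is selected (v = 0, w = 0), and the best remaining
# ratio is 40/4 = 10, so ub = 0 + 10 * 10 = 100.
# After taking the first item (v = 40, w = 4), the next ratio is 30/6 = 5,
# so ub = 40 + 6 * 5 = 70.
```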
24. What is the traveling salesman problem?
The problem can be modeled as a weighted graph, with the graph's vertices representing the cities and the edge weights specifying the distances. The problem can then be stated as finding the shortest Hamiltonian circuit of the graph, where the Hamiltonian circuit is defined as a cycle that passes through all the vertices of the graph exactly once.
25. What are the strengths of backtracking and branch-and-bound?
The strengths are as follows:
i. They are typically applied to difficult combinatorial problems for which no efficient algorithm for finding exact solutions is known to exist.
ii. They hold the hope of solving some instances of nontrivial sizes in an acceptable amount of time.
iii. Even if they do not eliminate any elements of a problem's state space and end up generating all of its elements, they provide a specific technique for doing so, which can be of some value.