
Definitions:

Object Oriented Programming: programming approach in which data is grouped into packages, called objects, and operations on an object are carried out by its member functions
Divide and Conquer: dividing original problem into
subproblems of the same or related type
ADT: defining a data type in terms of its operations rather than a specific implementation; this hides the details of the specific implementation
PSN: in Polish string notation the order of operations is determined by the position of operators and operands; there is no need for parentheses, and no need to scan the entire expression prior to evaluating
Recursion: process of solving a problem by solving smaller
problems of the same type
o Basis case: stopping case that we can solve directly
o Recursive case: split the current problem into smaller subproblems of the same type
o Examples:
Hanoi tower
Is_palindrome
String reversal
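A minimal recursive is_palindrome sketch (the helper and its signature are my own, not from the notes):

#include <string>

// Basis case: an empty or one-character range is a palindrome.
// Recursive case: the ends must match, then check the smaller inner range.
bool is_palindrome(const std::string& s, std::size_t lo, std::size_t hi) {
    if (lo >= hi) return true;
    if (s[lo] != s[hi]) return false;
    return is_palindrome(s, lo + 1, hi - 1);
}

bool is_palindrome(const std::string& s) {
    return s.empty() || is_palindrome(s, 0, s.size() - 1);
}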
Acyclic graph: no cycle; DAG = directed acyclic graph
Connected Graph (undirected graph): there exists a path between every pair of vertices
Strongly Connected Graph (directed graph): there is a directed path from every vertex to every other vertex
Complete graph: There exists an arc between each pair of
vertices
Classes in C++: member functions define how to store and access objects of a certain data type; class = data + member functions; constructor methods create new objects; classes support information hiding (data values are typically private, and the class can only be manipulated through public member functions)
o Differences between C and C++
A struct in C provides no privacy: its data fields can be inspected and modified freely anywhere
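A small sketch of the C vs C++ difference (type and member names are my own):

#include <iostream>

struct CPoint {            // C-style struct: no privacy, fields can be changed anywhere
    double x, y;
};

class Point {              // C++ class: data + member functions, with information hiding
public:
    Point(double x, double y) : x_(x), y_(y) {}   // constructor creates the new object
    double x() const { return x_; }
    double y() const { return y_; }
private:
    double x_, y_;         // private data: only reachable through the public member functions
};

int main() {
    CPoint c{1.0, 2.0};
    c.x = 42.0;            // legal in a C-style struct
    Point p(1.0, 2.0);
    // p.x_ = 42.0;        // compile error: x_ is private
    std::cout << p.x() << "\n";
    return 0;
}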

Time Complexity Theories


o Big O: the Big-oh time complexity represents an upper-bound estimate for the worst-case scenario, based on the number of operations that a program goes through
f(n) = O(g(n)) if ∃ k, n0 s.t. |f(n)| ≤ k*g(n) for all n ≥ n0

o Omega: the Omega time complexity represents a lower-bound estimate for the best-case scenario, based on the number of operations that a program uses
f(n) = Ω(g(n)) if ∃ k, n0 s.t. |f(n)| ≥ k*g(n) for all n ≥ n0
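o Worked example (numbers are my own, not from the notes): f(n) = 3n + 2 is O(n), since 3n + 2 ≤ 4n for all n ≥ 2 (k = 4, n0 = 2); it is also Ω(n), since 3n + 2 ≥ 3n for all n ≥ 1 (k = 3, n0 = 1)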
o Linear search: O(n)
o Binary search: O(log n)
o Merge sort: O(n log n)
o Bubble sort: O(n^2)
Linear Data Structures
o Stack: linear data structure, last in first out, nodes
removed in reverse order they are added
Stack underflow: trying to access a node from an
empty stack
Stack overflow: trying to insert a node into a full stack
Examples: run-time stack, converting infix to PSN, evaluating PSN, checking for balanced parentheses
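A minimal balanced-parentheses check using a stack (a sketch; the function name is my own):

#include <stack>
#include <string>

bool balanced(const std::string& expr) {
    std::stack<char> s;
    for (char c : expr) {
        if (c == '(') s.push(c);            // remember each opening parenthesis
        else if (c == ')') {
            if (s.empty()) return false;    // nothing open to match: unbalanced
            s.pop();                        // last opened is first closed (LIFO)
        }
    }
    return s.empty();                       // balanced only if nothing is left open
}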
o Queue: linear data structure, first in first out, nodes
removed in same order they are added
Operations:
push() // adds a node to the back of the queue
pop() // removes the node at the front of the queue
is_empty() // returns true if the queue is empty
size() // returns the number of nodes
front() // returns the first node
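A short sketch using std::queue, whose interface matches the operations above (it uses empty() rather than is_empty()):

#include <iostream>
#include <queue>

int main() {
    std::queue<int> q;
    q.push(1);                              // first in ...
    q.push(2);
    q.push(3);
    while (!q.empty()) {
        std::cout << q.front() << " ";      // ... first out: prints 1 2 3
        q.pop();
    }
    std::cout << "size: " << q.size() << "\n";   // 0 after the loop
    return 0;
}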
o List:
Linked List: collection of items (nodes) arranged
one after another with all nodes connected to the
next object with a link field (pointer variable)

head_ptr -> 4.0 -> 3.0 -> 2.0 -> 1.0 -> NULL
o node* head_ptr = new node(1.0, NULL);
o head_ptr = new node(2.0, head_ptr);
o head_ptr = new node(3.0, head_ptr);
o head_ptr = new node(4.0, head_ptr);
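A minimal node definition that would make the lines above compile (a sketch; the field names are my own):

#include <cstddef>       // for NULL

struct node {
    double data;         // value stored in this node
    node*  next;         // link field: pointer to the next node (NULL at the end)
    node(double d, node* n) : data(d), next(n) {}
};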


Sequential List:
Array:
o Parent of the element at node [i] (when root index is [0]) can be found at [(i - 1) / 2] (integer division)
o Left child (if exists) [2i+1]
o Right Child (if exists) [2i + 2]
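The index arithmetic as a small sketch (function names are my own):

int parent(int i)      { return (i - 1) / 2; }   // root is at index 0
int left_child(int i)  { return 2 * i + 1; }
int right_child(int i) { return 2 * i + 2; }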
Sequential vs Linked
Frequent access to nodes in the list: sequential
Frequent resizing is needed: linked
Frequent insertion into the middle: linked
Frequent deletion from the middle: linked
Operations occur at a cursor: linked
Operations occur at a 2-way cursor: doubly linked list
Number of nodes is large: sequential
Dynamic Variable: unlike ordinary variables, not declared; created during the execution of the program
o Memory allocation is determined during the program's execution
Pointer Variable: a variable which stores a pointer (memory address) of some variable; the value it points to is accessed by writing * before the pointer variable's name
o *p1 = 100; *p2 = 200;
o delete p1; p1 = p2; // p1 and p2 now point to the same memory location (which holds 200)
o *p1 = 300; // *p1 and *p2 refer to the same memory location, so both are now 300
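A runnable version of the example above (a sketch; the variable names follow the notes):

#include <iostream>

int main() {
    int* p1 = new int;     // dynamic variables created at run time
    int* p2 = new int;
    *p1 = 100;
    *p2 = 200;

    delete p1;             // release the first dynamic variable
    p1 = p2;               // p1 and p2 now point to the same memory location (200)

    *p1 = 300;             // both names refer to the same location, so both are 300
    std::cout << *p1 << " " << *p2 << "\n";   // prints 300 300

    delete p2;             // release the remaining dynamic variable once
    return 0;
}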
Dynamic Data Structure: data structure whose size is
determined at the program execution
Static Data Structure: data structure whose size is determined
at the compilation time
Non Linear Data Structures
o Tree: finite set of elements (nodes) with a finite set of directed edges; an acyclic graph (the root node has no parent); every node except the root node has exactly one parent; all nodes are reachable from the root node through a unique path
Binary Tree: every node has at most 2 children
Max number of nodes at level i is 2^i
o Basis step: i = 0; the number of nodes can only be 1 = 2^0
o Hypothesis: assume the max number of nodes at level i is 2^i
o Induction: every node has at most 2 children, so the max number of nodes on the next level is 2*2^i, which equals 2^(i+1)
Full Binary Tree:
max # of nodes at depth k: 2^(k+1) - 1
Every internal node has two children
Every terminal node is placed on the last level
Complete Binary Tree
Subtree of depth k-1 is a full binary tree
Terminal nodes are placed from left to right
with no vacant space
Heap Tree:
Assume max heap unless otherwise stated
Key value at the root node is the largest
o Left subtree and right subtree are also
heap trees
Know how to build a heap tree from a sequence
of elements, as well as how to add and delete a
key value in heap tree

Application of heap tree (e.g., heap sort): pop the root node, then adjust the remaining nodes so they form a heap again, and repeat until the tree is empty
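A sketch of that application using the standard heap algorithms (values are my own; std::make_heap builds a max-heap, matching the notes' assumption):

#include <algorithm>
#include <iostream>
#include <vector>

int main() {
    std::vector<int> v = {4, 1, 7, 3, 9, 2};

    std::make_heap(v.begin(), v.end());       // build a max-heap: v.front() is the largest key

    // Repeatedly "pop" the root: pop_heap moves the largest element to the back
    // and adjusts the remaining elements so they form a heap again.
    for (auto end = v.end(); end != v.begin(); --end)
        std::pop_heap(v.begin(), end);

    for (int x : v) std::cout << x << " ";    // prints 1 2 3 4 7 9 (ascending)
    std::cout << "\n";
    return 0;
}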

Binary Search Tree
Keys in the left subtree are smaller than the node's key; keys in the right subtree are larger (or equal, depending on convention)
Know how to add and delete a node
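A minimal BST insertion sketch (struct and function names are my own):

struct bst_node {
    int       key;
    bst_node* left;
    bst_node* right;
    bst_node(int k) : key(k), left(nullptr), right(nullptr) {}
};

// Returns the (possibly new) root of the subtree after inserting key.
bst_node* insert(bst_node* root, int key) {
    if (root == nullptr) return new bst_node(key);    // empty spot found: add the node here
    if (key < root->key)
        root->left = insert(root->left, key);         // smaller keys go into the left subtree
    else
        root->right = insert(root->right, key);       // larger (or equal) keys go to the right
    return root;
}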
B tree of order k:
Root is either terminal or has at least 2
subtrees
**make sure to note what the order is (don't simply assume it is order 5 like the majority of the ones we have done in class)
Huffman Tree: a kind of binary tree
Motivation for Huffman code: data compression and transfer efficiency; the encoded text takes less memory space to represent
Algorithm (see the sketch after this list)
o Organize the character set into a row (list) ordered by frequency
o Find the two nodes with the smallest weights and join them to form a third node whose weight is the combined weight of the original 2
o Repeat the previous step until all nodes are combined into a single tree
Know how to generate from a given distribution
(make sure to label edges)
**see last problem of hw 3
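A minimal sketch of the algorithm above using a priority queue (struct names and frequencies are my own example):

#include <iostream>
#include <queue>
#include <vector>

struct hnode {
    char   ch;        // character ('\0' for internal nodes)
    int    weight;    // frequency, or combined weight for internal nodes
    hnode* left;
    hnode* right;
    hnode(char c, int w, hnode* l = nullptr, hnode* r = nullptr)
        : ch(c), weight(w), left(l), right(r) {}
};

struct heavier {      // comparator so the smallest weight comes out of the queue first
    bool operator()(const hnode* a, const hnode* b) const { return a->weight > b->weight; }
};

int main() {
    // Step 1: one node per character, organized by frequency.
    std::priority_queue<hnode*, std::vector<hnode*>, heavier> pq;
    pq.push(new hnode('a', 45));
    pq.push(new hnode('b', 13));
    pq.push(new hnode('c', 12));
    pq.push(new hnode('d', 16));

    // Steps 2-3: join the two smallest-weight nodes into a new node whose weight
    // is their combined weight, until a single tree remains.
    while (pq.size() > 1) {
        hnode* x = pq.top(); pq.pop();
        hnode* y = pq.top(); pq.pop();
        pq.push(new hnode('\0', x->weight + y->weight, x, y));
    }
    hnode* root = pq.top();
    std::cout << "total weight: " << root->weight << "\n";   // 45 + 13 + 12 + 16 = 86
    return 0;
}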
Graph Theory
o Topological Sort
Orders the vertices of a DAG consistently with a partial ordering relation
Mathematical definition of partial ordering:
Reflexive
Transitive
Anti-symmetric
o Greedy Algorithms: find the best selection at each stage, then combine all of the best selections to form the solution of the problem
Dijkstra's Algorithm (see the sketch after this list)
O(v^2) // v = number of vertices
Solves the shortest path problem
Kruskal's Algorithm
Finds the minimum spanning tree for a connected weighted graph
O(e log e) or O(e log v) // e = number of edges; v = number of vertices
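A minimal Dijkstra sketch (graph, weights, and names are my own; this is the simple O(v^2) scan version quoted above):

#include <iostream>
#include <limits>
#include <vector>

int main() {
    const int INF = std::numeric_limits<int>::max();
    // adj[u] holds (v, weight) pairs for each edge u -> v.
    std::vector<std::vector<std::pair<int, int>>> adj = {
        {{1, 4}, {2, 1}},    // 0 -> 1 (4), 0 -> 2 (1)
        {{3, 1}},            // 1 -> 3 (1)
        {{1, 2}, {3, 5}},    // 2 -> 1 (2), 2 -> 3 (5)
        {}                   // 3
    };
    int n = adj.size(), source = 0;
    std::vector<int> dist(n, INF);
    std::vector<bool> done(n, false);
    dist[source] = 0;

    for (int iter = 0; iter < n; ++iter) {
        int u = -1;                               // greedy step: pick the closest unfinished vertex
        for (int v = 0; v < n; ++v)
            if (!done[v] && (u == -1 || dist[v] < dist[u])) u = v;
        if (dist[u] == INF) break;                // the rest are unreachable
        done[u] = true;
        for (auto [v, w] : adj[u])                // relax every edge leaving u
            if (dist[u] + w < dist[v]) dist[v] = dist[u] + w;
    }

    for (int v = 0; v < n; ++v)
        std::cout << "dist(0," << v << ") = " << dist[v] << "\n";   // 0, 3, 1, 4
    return 0;
}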

o Graph Traversal
Inorder: traverse left subtree, visit root, right
subtree
Preorder: visit root, left subtree, right subtree
Postorder: traverse left subtree, right subtree, root
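A short sketch of the three traversal orders (the node struct and names are my own):

#include <iostream>

struct tnode {
    int    key;
    tnode* left;
    tnode* right;
};

void inorder(const tnode* t) {      // left subtree, root, right subtree
    if (!t) return;
    inorder(t->left);
    std::cout << t->key << " ";
    inorder(t->right);
}

void preorder(const tnode* t) {     // root, left subtree, right subtree
    if (!t) return;
    std::cout << t->key << " ";
    preorder(t->left);
    preorder(t->right);
}

void postorder(const tnode* t) {    // left subtree, right subtree, root
    if (!t) return;
    postorder(t->left);
    postorder(t->right);
    std::cout << t->key << " ";
}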
Search
o DFS: Depth first search; Uses Stack
Go as deeply as possible; at a dead end (terminal node), back up to the nearest node with an unvisited neighbor
o BFS: Breadth first search; Uses Queue
From each vertex, visit all adjacent to it and then
repeat for all vertices (visit all nodes of same level
first)
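A minimal BFS sketch using a queue over an adjacency list (the graph is my own example):

#include <iostream>
#include <queue>
#include <vector>

int main() {
    std::vector<std::vector<int>> adj = {{1, 2}, {3}, {3}, {}};   // small example graph
    std::vector<bool> visited(adj.size(), false);

    std::queue<int> q;
    q.push(0);                        // start from vertex 0
    visited[0] = true;
    while (!q.empty()) {
        int u = q.front(); q.pop();
        std::cout << u << " ";        // visit u; prints 0 1 2 3 (level by level)
        for (int v : adj[u])          // then enqueue all unvisited vertices adjacent to u
            if (!visited[v]) { visited[v] = true; q.push(v); }
    }
    std::cout << "\n";
    return 0;
}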
o Sequential search
Worst case O(n) comparisons needed
Checks every element, from first to last, until the
desired element is found
o Binary search
Best Case: O(1)
Worst Case: O(log n)
Precondition: array should be sorted in ascending
order
Compares the desired value to the center of the sorted array, then adjusts the left or right limit depending on whether the target is too big or too small
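A minimal iterative binary search sketch (the function name is my own; the array must be sorted in ascending order):

#include <vector>

// Returns the index of target in the sorted vector a, or -1 if it is not present.
int binary_search_index(const std::vector<int>& a, int target) {
    int lo = 0, hi = static_cast<int>(a.size()) - 1;   // left and right limits
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;                  // center of the current range
        if (a[mid] == target) return mid;
        if (a[mid] < target) lo = mid + 1;             // center is too small: search the right half
        else                 hi = mid - 1;             // center is too big: search the left half
    }
    return -1;
}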
Hashing: take a data item and assign an integer value to it (storing it in an array so that it can then be accessed based on that integer value)
o Collision: 2 nonidentical data items are hashed to the same address
Ways around collision:
Probing: simply place the item at the next open index
Chaining: add the item to a linked list at that index
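A small sketch of chaining: each array address holds a linked list (the bucket count and hash function are my own choices):

#include <iostream>
#include <list>
#include <vector>

int main() {
    const int m = 7;                              // number of buckets
    std::vector<std::list<int>> table(m);         // chaining: one list per address

    auto hash = [m](int key) { return key % m; };

    for (int key : {10, 17, 24, 3})               // all four keys hash to address 3
        table[hash(key)].push_back(key);          // collisions are chained at that index

    for (int i = 0; i < m; ++i) {
        std::cout << i << ":";
        for (int key : table[i]) std::cout << " " << key;
        std::cout << "\n";
    }
    return 0;
}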
Sort
o Insertion sort: O(n^2)
Data Structure:
Starts at the beginning and compares each item (>, <, =) against the already-sorted items until a comparison fails, which gives its position (2 for loops, one to go through all items needing to be sorted and the other to actually do the sorting)
o Selection sort:
Data Structure: Array

Find the smallest item and place it in the sorted file; repeat until all data have been sorted
2 examples:
straight selection sort
heap sort: O(n log n)
o Merge sort:
Data Structure:
Divide and Conquer algorithm
Divides array into two half size arrays by recursive
calls
Quick sort uses the same divide-and-conquer method, however:
MS: dividing the array is simple, but combining the two parts into a single array is complicated
QS: dividing the array is complicated, but merging the two parts is simple
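A minimal merge sort sketch (the function names are my own): the recursive calls split the array in half, and the merge step does the complicated combining.

#include <vector>

// Combine two sorted halves into one sorted vector.
std::vector<int> merge(const std::vector<int>& a, const std::vector<int>& b) {
    std::vector<int> out;
    std::size_t i = 0, j = 0;
    while (i < a.size() && j < b.size())
        out.push_back(a[i] <= b[j] ? a[i++] : b[j++]);
    while (i < a.size()) out.push_back(a[i++]);
    while (j < b.size()) out.push_back(b[j++]);
    return out;
}

std::vector<int> merge_sort(const std::vector<int>& v) {
    if (v.size() <= 1) return v;                        // basis case
    std::size_t mid = v.size() / 2;                     // dividing is simple: split in half
    std::vector<int> left(v.begin(), v.begin() + mid);
    std::vector<int> right(v.begin() + mid, v.end());
    return merge(merge_sort(left), merge_sort(right));  // combining is the complicated part
}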
