Time complexity
"Dammit I'm mad"
is a palindrome
In 1993, noted comedian Demetri Martin took a math course at Yale called Fractal Geometry. His final project: a 225-word palindromic poem.
What does that have to do with fractals? I don't know, it's a liberal arts school.
That's nothing.
Rats peed on hope. Elsewhere dips a web. Be still if I fill its ebb. Ew, a spider ... eh? We sleep.
Oh no! Ah, say burning is as a deified gulp Deep, stark cuts saw it in one position. in my halo of a mired rum tin. Part animal, can I live? Sin is a name. I erase many men. Oh, to be man, a sin. Both, one ... my names are in it. Murder? Is evil in a clam? In a trap? I'm a fool. A hymn I plug, No. It is open. Deified as a sign in ruby ash - a goddam level I lived at. On it I was stuck. On mail let it in. I'm it. Oh, sit in ample hot spots. Oh, wet! A loss it is alas (sip). I'd assign it a name. Name not one bottle minus an ode by me: "Sir, I deliver. I'm a dog." Evil is a deed as I live. Dammit I'm mad.
In 1986, one Lawrence Levine wrote an entire palindromic novel. It had ~100,000 letters.
Suppose you are the proofreader. How would you check if there's a mistake?
Tacit, I hate gas (aroma of evil), masonry, tramps, a wasp martyr. Remote liberal ceding is idle if... heh-heh, Sam X. Xmas murmured in an undertone to tow-trucker Edwards. Alas. Simple hot." To didos, no tracks, Ed decided. Or eh trucks abob.
(160 pages)
Bob, ask Curt. He rode diced desk carton. So did Otto help Miss Alas draw Derek-cur. Two tote? Not red Nun. A nide. Rum. Rum Sam X. Xmas. Heh, heh. Field, I sign. I declare bile to merry tramps. A wasp martyr? No, Sam live foam or a sage Tahiti Cat.
TwoFingersPalindromeTest(S,n)
    // returns Yes iff string S[1]...S[n] is a palindrome
    lo := 1
    hi := n
    while (lo < hi)
        if S[lo] ≠ S[hi] then return No
        lo := lo + 1
        hi := hi - 1
    end while
    return Yes
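As a sketch, the same two-finger scan in Python (0-indexed; the function name is my own, not from the slides):

```python
def two_fingers_palindrome_test(s):
    """Return "Yes" iff s reads the same forwards and backwards."""
    lo, hi = 0, len(s) - 1          # Python strings are 0-indexed
    while lo < hi:
        if s[lo] != s[hi]:          # mismatched pair: not a palindrome
            return "No"
        lo += 1
        hi -= 1
    return "Yes"
```

Each character is examined at most once, so the scan takes O(n) comparisons.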
The
Today:
Great Ideas
in Theoretical Computer Science
Problems
PALINDROMES is a problem.
dammitimmad is an instance (also known as an input). Solution: Yes.
More instances:
    selfless → No
    zxckallkdsflsdkf → No
    parahazramarzaharap → Yes
Problems

Another example: MULTIPLICATION

    Instance          → Solution
    (15251, 252)      → 3843252
    (12345679, 9)     → 111111111
    (50, 610)         → 30500
    (610, 25)         → 15250
Chess?

Question: Is this a winning position for white?
An interesting question, but it's not a Problem.
Problems

Let's try again.
CHESS

Instance: An arbitrary position.
Solution: Yes/No, is it a winning position for white?
Only finitely many instances → still not a Problem.
GENERALIZED CHESS

Instance: A board size and an arbitrary position.
Solution: Yes/No, is it a winning position for white?
Yes! That's a problem!
Algorithms
Definition: A well-defined procedure which takes an input (instance) and gives an output. I think you know what I mean. (But see Lecture 22.)
In 251, we write all our algorithms in pseudocode.
Algorithms

Algorithm A solves problem R if, for every instance of R, A outputs the correct solution.
"An algorithm is a finite answer to an infinite number of questions." (Stephen Kleene)

Great Idea #2: Input size
Instance/input size: usually denoted n.
Unless otherwise specified, n = # of bits needed to specify the input.
(It's often otherwise specified! Formally, what n means is part of the definition of the problem.)
Suppose instance I is: selfless

TwoFingers(S,n)
    lo := 1                     // 1 step
    hi := n                     // 1 step
    while (lo < hi)             // 1 step per test
        if S[lo] ≠ S[hi] then   // 3 steps? (We're just making this up.)
            return No           // 1 step
        lo := lo + 1            // 1 step
        hi := hi - 1            // 1 step
    end while
    return Yes                  // 1 step

A on instance I: 1 + 1 + 1 + 3 + 1 + 1 + 1 + 3 + 1 (+ a few) = 14 steps.

In general, on instances I of size n, A takes O(n) steps.
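The per-line costs in the slide are admittedly made up, but the linear scaling is not. A Python sketch (names mine) that tallies those same invented costs while running the scan:

```python
def two_fingers_steps(s):
    """Run the two-finger scan on s and count 'steps', using the
    made-up per-operation costs from the slides:
    1 per assignment/test/return, 3 per character comparison."""
    steps = 2                        # lo := 1 and hi := n
    lo, hi = 0, len(s) - 1
    while True:
        steps += 1                   # while-test (lo < hi)
        if not (lo < hi):
            break
        steps += 3                   # the comparison S[lo] != S[hi]
        if s[lo] != s[hi]:
            steps += 1               # return No
            return "No", steps
        steps += 2                   # lo := lo + 1 and hi := hi - 1
        lo += 1
        hi -= 1
    steps += 1                       # return Yes
    return "Yes", steps
```

Doubling the input length roughly doubles the count: the total is a small constant per loop iteration, and there are at most n/2 iterations.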
CLOSEST-PAIR

Instance: A list of (at least 2) integers.
Input size: n is defined to be the length of the list.
Solution: Distance between the closest pair in the list.
A simple algorithm:

MyAlg(L)
    closestSoFar := |L[1] - L[2]|
    for i = 1...n
        for j = i+1...n        // j > i, so we never compare L[i] with itself
            diff := |L[i] - L[j]|
            if diff < closestSoFar then
                closestSoFar := diff
    return closestSoFar
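A direct Python translation of this brute force (function name mine):

```python
def closest_pair_bruteforce(L):
    """O(n^2): compare every pair of entries once, keep the smallest gap."""
    n = len(L)
    assert n >= 2, "CLOSEST-PAIR instances have at least 2 integers"
    closest = abs(L[0] - L[1])
    for i in range(n):
        for j in range(i + 1, n):    # j > i: no self-comparisons, no repeats
            closest = min(closest, abs(L[i] - L[j]))
    return closest
```

The nested loops examine n(n-1)/2 pairs, hence the O(n²) running time.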
When it comes to running time, focus on the big picture: how it scales as a function of n.
An exact step count is not a very meaningful statement:
- analogous to quoting too many significant digits
- overly depends on the definition of an elementary step
- even at the level of machine code, we're still ignoring constant-factor time differences like processor vs. cache vs. disk speed
[Chart: running times n, 2n, 3n, 4n, n² and n, n², n³, 2ⁿ, n! plotted against "one sec", "one hour", "one year", for inputs up to ~10⁸.]
Intrinsic complexity
Given a problem, e.g., PALINDROMES, we can ask about its intrinsic complexity: How fast is its fastest algorithm?
PALINDROMES:

We know an O(n) algorithm, TwoFingers. Could there be a faster one?

Theorem: Any algorithm solving PALINDROMES uses ≥ n-1 steps.
(Assuming a fixed model of elementary steps, and doing analysis up to O(·)'s.)

Proof: Suppose algorithm A solves PALINDROMES in fewer steps.
Let I be the string aaa...a (n times), which is a palindrome.
When A runs with input I, there must be distinct 1 ≤ j1, j2 ≤ n such that A never reads I[j1] or I[j2].
Let I′ be the same as I except that I′[j1] = b and I′[j2] = c.
When A runs on I′, it has the same behavior as when it runs on I. (Why?)
But A says Yes on I, and it must say No on I′ (why?), a contradiction.
PALINDROMES:

We know an O(n) algorithm, TwoFingers.
Theorem: Any algorithm solving PALINDROMES uses ≥ n-1 steps.
Conclusion: The intrinsic time complexity of PALINDROMES is linear; Θ(n) time is necessary and sufficient.
CLOSEST-PAIR:

We know an O(n²) brute-force algorithm. Is there a faster algorithm?
Yes, use sorting! O(n log n) time.
Not-too-hard theorem: ≥ n steps are required.
Is there an O(n) algorithm? Depends on your model of step-counting!
Intrinsic complexity: linear / quasilinear.
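The sorting idea in Python (name mine): after sorting, the closest pair of values must be adjacent, so one linear pass over the sorted list suffices.

```python
def closest_pair_sorted(L):
    """O(n log n): sort, then the closest pair is some adjacent pair."""
    s = sorted(L)                    # O(n log n) comparison sort
    return min(s[i + 1] - s[i] for i in range(len(s) - 1))
```

The O(n log n) cost is entirely in the sort; the final scan is O(n).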
MULTIPLICATION:

In grade school you learn an O(n²) algorithm.
Easy to show ≥ n steps are required.
Is there a faster algorithm? Yes! A much faster one.
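The slides don't name the faster method; one classic example (my choice, not the slides') is Karatsuba's algorithm, which runs in roughly O(n^1.585) by replacing 4 half-size multiplications with 3:

```python
def karatsuba(x, y):
    """Multiply nonnegative integers using Karatsuba's trick:
    3 recursive half-size multiplications instead of 4."""
    if x < 10 or y < 10:                      # base case: a single digit
        return x * y
    m = max(len(str(x)), len(str(y))) // 2    # split point, in decimal digits
    p = 10 ** m
    a, b = divmod(x, p)                       # x = a*10^m + b
    c, d = divmod(y, p)                       # y = c*10^m + d
    ac = karatsuba(a, c)
    bd = karatsuba(b, d)
    mid = karatsuba(a + b, c + d) - ac - bd   # equals a*d + b*c
    return ac * p * p + mid * p + bd
```

Even faster algorithms exist (FFT-based methods run in near-linear time), but Karatsuba already beats the grade-school bound.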
HAMILTONIAN-CYCLE:

Instance: A connected graph.
Solution: Yes/No: Is there a tour visiting each vertex exactly once?
Instance size: n = # of vertices.
HAMILTONIAN-CYCLE:

Brute-force algorithm: try all tours, ~n! steps.
[Held–Karp]: dynamic programming, ~2ⁿ steps.
[Björklund '10]: clever algebraic brute-force, ~1.657ⁿ steps.
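A sketch of the ~2ⁿ Held–Karp-style dynamic program, for small graphs given as an adjacency matrix (names mine):

```python
def has_hamiltonian_cycle(adj):
    """adj: n x n 0/1 adjacency matrix.
    dp[mask][v] = True iff there is a path that starts at vertex 0,
    visits exactly the vertices in bitmask mask, and ends at v."""
    n = len(adj)
    if n == 1:
        return True
    full = (1 << n) - 1
    dp = [[False] * n for _ in range(1 << n)]
    dp[1][0] = True                            # the trivial path {0} ending at 0
    for mask in range(1 << n):
        if not (mask & 1):
            continue                           # every path must include vertex 0
        for v in range(n):
            if not dp[mask][v]:
                continue
            for w in range(n):                 # extend the path to a new vertex w
                if adj[v][w] and not (mask >> w) & 1:
                    dp[mask | (1 << w)][w] = True
    # a Hamiltonian cycle = a full path ending at some v with an edge back to 0
    return any(dp[full][v] and adj[v][0] for v in range(1, n))
```

There are 2ⁿ masks and n² work per mask, so the total is ~2ⁿ·n² steps: vastly better than n!, but still exponential.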
EULERIAN-CYCLE:

Instance: A connected graph.
Solution: Yes/No: Is there a tour visiting each edge exactly once?
Instance size: n = # of vertices.
EULERIAN-CYCLE:

Algorithm E:
    Check if every vertex has even degree.
    If so, output Yes. Else output No.

Euler's Theorem: Algorithm E solves EULERIAN-CYCLE.
Time: TimeE(n) = O(n²).
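Algorithm E in Python (name mine), for a connected graph given as an adjacency list:

```python
def has_eulerian_cycle(adjacency):
    """adjacency: dict mapping each vertex to its list of neighbours.
    The graph is assumed connected, as in the problem statement.
    By Euler's Theorem, a connected graph has an Eulerian cycle
    iff every vertex has even degree."""
    return all(len(neighbours) % 2 == 0 for neighbours in adjacency.values())
```

Checking each degree is trivial; the O(n²) bound just accounts for reading the whole input graph.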
HAMILTONIAN-CYCLE:
[Chart: running times n, n², n³, 2ⁿ, n! against time in seconds, up to ~10¹⁶ s (the age of the universe).]
Seems to require exponential time. We have no good understanding of which graphs have Hamiltonian cycles.
EULERIAN-CYCLE:
Polynomial time. Euler's Theorem explains Eulerian cycles. There is an enormous efficiency chasm between polynomial and exponential time.
HAMILTONIAN-CYCLE:
Seems to require exponential time. We have no good understanding of which graphs have Hamiltonian cycles.
EULERIAN-CYCLE:
Polynomial time. Euler's Theorem explains Eulerian cycles. There is an enormous understanding chasm between polynomial and exponential time.
Polynomial time
50 years of computer science experience shows it's a very compelling definition:
- A necessary first step towards truly efficient algorithms; associated with beating brute-force.
- Very robust to the notion of what an elementary step is.
- Easy to work with: plug a poly-time subroutine into a poly-time algorithm and it's still poly-time. (Not true for quasilinear time.)
- Empirically, it seems that most natural problems with poly-time algorithms also have efficient-in-practice algorithms.
It's a negatable benchmark: "not polynomial time" pretty much implies "not efficient".
Great Idea #7: Big-O notation

Definition recall(?): Let f, g : ℕ → ℝ≥0.
f(n) = O(g(n)) means (informally): f(n) is at most a constant times g(n), for all large enough n.
Formally: there exist C > 0 and n0 such that f(n) ≤ C·g(n) for all n ≥ n0.
WARNING:

In the expression f(n) = O(g(n)), the equals sign (=) does not really mean equals. It's just tradition to write it that way. You can define O(·) with sets and write f(n) ∈ O(g(n)) if you really want.
Example: f(n) = O(n²), because f(n) ≤ 4n² for all n ≥ 13.
Big-Ω:

If O(·) is like "roughly ≤", then Ω(·) is like "roughly ≥".
f(n) = Ω(g(n)) means: ∃ c > 0, n0 such that f(n) ≥ c·|g(n)| for all n ≥ n0.

Big-Θ:

Θ(·) is like "roughly =". A stronger statement:
f(n) = Θ(g(n)) means: ∃ C1, C2 > 0, n0 such that C1·g(n) ≤ f(n) ≤ C2·g(n) for all n ≥ n0.

Example: 2ⁿ = O(3ⁿ), and n! = O(nⁿ).
If you want to be careful about how many steps an algorithm takes, then you'll have to be careful.
O(n)? lo = 1, hi = n.
Decrementing hi = 100000000000000 to hi = 011111111111111 flips many bits: didn't that take Ω(log n) steps? So, O(n log n)?
Does it take the disk / memory-pointer / bus ~n steps to get from S[1] to S[n]? So, O(n²)?
RAM model
Good combination of reality/simplicity.
On input size n, assume data is stored in words / registers of size O(log n). (So you can store an array index in 1 word.)
Indirect memory accesses, like S[i], cost 1.
Any operation on words (like hi := hi - 1) costs 1.
All reasonable models of step-counting for algorithms are polynomially equivalent.
E.g., it is a straightforward theorem that Turing Machines can simulate the RAM model with at most polynomial slowdown, and vice versa.
In light of research from the 1980s: We believe (but can't prove) that the Strong Church–Turing Thesis holds true even with randomized computation.
In light of research from the 1990s: We believe (but can't prove) that the Strong Church–Turing Thesis is not true.
Study Guide

Definitions: problems, instances, algorithms, input size, running time
Understand: polynomial time, run-time scaling
How-to: count algorithm steps, use O(·) notation, prove Ω(n) time is necessary