
Chapter 3.2

Turing Machine Variations


The stay-put TM: δ : Q × Γ → Q × Γ × {L, R, S}. This machine can easily be converted to our standard TM. For each stay-put transition δ(qi, cj) = (qk, cl, S), add an intermediate state q′k and the following pair of transitions to δ: δ(qi, cj) = (q′k, cl, R) and δ(q′k, x) = (qk, x, L) for every x ∈ Γ. Note: no variation on our basic TM adds any computational power; this equivalence is what is meant by the robustness of the model.
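The conversion above is just a small rewrite of the transition table. Here is a minimal Python sketch (not from the slides; the dict encoding, the helper name, and the 'bounce' intermediate-state labels are assumptions made for illustration):

```python
def remove_stay_moves(delta, tape_alphabet):
    """delta maps (state, symbol) -> (state, symbol, move), move in {'L', 'R', 'S'}.
    Returns an equivalent table that uses only 'L' and 'R'."""
    new_delta = {}
    for (q, c), (qk, cl, move) in delta.items():
        if move != 'S':
            new_delta[(q, c)] = (qk, cl, move)
            continue
        # Replace the stay move: step right into a fresh intermediate state,
        # then step left over whatever symbol is there, ending in qk as intended.
        bounce = ('bounce', qk)               # assumed fresh, one per target state
        new_delta[(q, c)] = (bounce, cl, 'R')
        for x in tape_alphabet:
            new_delta[(bounce, x)] = (qk, x, 'L')
    return new_delta
```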

Chapter 3.2

Turing Machine Variations


Multi-tape TM: Like a regular TM but with k tapes, each with its own head for reading and writing. Initially, the input is on tape 1, and the other tapes start out blank. δM : Q × Γ^k → Q × Γ^k × {L, R}^k. Example: δM(qi, a1, ..., ak) = (qj, b1, ..., bk, L, ..., R). Theorem 3.8: Every multi-tape TM has an equivalent single-tape TM. Proof Sketch: Convert the multi-tape TM M to an equivalent single-tape TM S; that is, show how to simulate M with S.
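As a tiny illustration of the k-tuple signature (my own encoding, not the book's), one transition of a hypothetical 2-tape machine could be stored like this:

```python
# One move of a 2-tape TM: in state qi, reading 'a' on tape 1 and '0' on
# tape 2, write 'b' and '1', move head 1 left and head 2 right, enter qj.
delta = {
    ('qi', ('a', '0')): ('qj', ('b', '1'), ('L', 'R')),
}
```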

Chapter 3.2

Turing Machine Variations


[Figure: the multi-tape machine M with tape contents 0 1 0 1 0, a a a, and b a, and its single-tape simulator S, whose one tape stores all three as # 0 1 0 1 0 # a a a # b a #, with a dot over the symbol under each virtual head.]

Tape alphabets: ΓM = {0, 1, a, b, ⊔} and ΓS = {0, 1, a, b, ⊔, #, 0̇, 1̇, ȧ, ḃ, ⊔̇} (the dotted symbols mark the virtual head positions).

Chapter 3.2

Turing Machine Variations


S = On input string w = w1...wn:
1. First S puts its tape into the format that represents all k tapes of M. The formatted tape of S contains: # ẇ1 w2 ... wn # ⊔̇ # ... # (one region per tape, with a dot marking the cell under each virtual head).
2. To simulate a single move of M, S scans its tape from the first #, which marks the left-hand end, to the (k + 1)st #, which marks the right-hand end, in order to determine the symbols under the virtual heads. Then S makes a second pass to update the tapes according to the way that M's transition function dictates.
3. If at any point S moves one of the virtual heads to the right onto a #, this action signifies that M has moved the corresponding head onto the previously unread blank portion of that tape. So S writes a blank symbol on this tape cell and shifts the tape contents, from this cell until the rightmost #, one unit to the right. Then S continues as before.
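The Python sketch below illustrates stages 1-3 under some simplifying assumptions of mine: a virtual head is marked by wrapping the cell in a ('dot', symbol) tuple, BLANK stands for ⊔, and only the tape bookkeeping is shown (not M's transition function):

```python
BLANK = ' '

def format_tape(w, k):
    """Stage 1: lay out k virtual tapes on one tape as # w1 w2 ... wn # _ # ... #,
    with the first cell of every region carrying the virtual-head mark."""
    tape = ['#']
    tape += [('dot', w[0] if w else BLANK)] + list(w[1:])
    for _ in range(k - 1):
        tape += ['#', ('dot', BLANK)]
    tape.append('#')
    return tape

def read_virtual_heads(tape):
    """Stage 2, first pass: scan left to right and collect the symbol under
    each of the k virtual heads (the dotted cells)."""
    return [cell[1] for cell in tape if isinstance(cell, tuple)]

def make_room(tape, i):
    """Stage 3: a virtual head ran onto a #; write a blank at position i and
    shift everything from there rightward by one cell (nothing lies beyond
    the rightmost #, so a plain list insert has the same effect)."""
    tape.insert(i, BLANK)
```

For example, format_tape('0101', 3) produces ['#', ('dot', '0'), '1', '0', '1', '#', ('dot', ' '), '#', ('dot', ' '), '#'], matching the # ẇ1w2...wn # ⊔̇ # ⊔̇ # layout above.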
Chapter 3.2

Turing Machine Variations


[Figure: S's single tape, # 0 1 0 1 0 # a a a # b a #, before and after a shift: when a virtual head moves right onto a #, S writes a dotted blank in that cell and shifts everything from there to the rightmost # one cell to the right.]

Recall: Defn: A language is Turing-recognizable if some TM recognizes it.

Chapter 3.2

Turing Machine Variations


Corollary 3.9: A language L is Turing-recognizable iff some multi-tape TM recognizes it.
Proof (one direction): If a language L is Turing-recognizable, then some multi-tape TM recognizes it. Assume L is Turing-recognizable. By definition, L is recognized by an ordinary, single-tape TM. This TM is just a multi-tape TM with one tape. So L is recognized by a multi-tape machine.
Proof (other direction): If some multi-tape TM recognizes L, then L is Turing-recognizable. We must show that any multi-tape TM can be simulated by a single-tape TM. Theorem 3.8 shows this.

Chapter 3.2

Turing Machine Variations


Nondeterministic TMs: At any point in its computation, the TM may proceed in a number of ways. δN : Q × Γ → P(Q × Γ × {L, R}). Wherever there is a choice point, the computation branches, as if a new copy of the TM were spawned; the branches form a tree. If one branch succeeds (enters an accept state), then the nondeterministic machine accepts its input.
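A minimal sketch of the signature change (dict encoding assumed, not from the slides): each (state, symbol) pair now maps to a set of possible moves instead of a single move.

```python
# Nondeterministic transition function: values are *sets* of (state, symbol, move).
delta = {
    ('q0', '0'): {('q0', '0', 'R'), ('q1', 'x', 'R')},   # a genuine choice point
    ('q1', '1'): {('q_accept', '1', 'R')},               # only one option here
}

def choices(delta, state, symbol):
    """All moves N may take from this configuration; an empty set means the branch halts."""
    return delta.get((state, symbol), set())
```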

Chapter 3.2

Turing Machine Variations


Theorem 3.10: Every nondeterministic TM has an equivalent deterministic TM. Proof Idea: Keep a tree of the nondeterministic TM's branches and search it breadth-first.
[Figure: the computation tree of N. Each node is a copy of N, labelled by an address string such as 1, 11, 12, 13, 21, 31, 32 recording the choices made along that branch.]
Why not search depth-first? A depth-first search could follow one infinite branch forever and miss an accepting branch elsewhere, so instead each branch must be executed a step at a time, breadth-first.

Chapter 3.2

Turing Machine Variations


Theorem 3.10: Every nondeterministic TM has an equivalent deterministic TM. Proof Idea: For D, the deterministic version of N, use three tapes:
1. Tape 1 - the input string
2. Tape 2 - simulation tape: a copy of N's tape on one branch of its computation, run for some number of steps
3. Tape 3 - address tape: the string of choices that selects the branch currently being simulated
[Figure: successive snapshots of D (in states q1, q4, q7, ...). Tape 1 holds the input 0 0 1 0; tape 2 holds the simulated contents of N's tape on the current branch; tape 3 holds an address string such as 1 3 or 2 1 selecting that branch.]

Chapter 3.2


Turing Machine Variations


Theorem 3.10: Every nondeterministic TM has an equivalent deterministic TM. Description of D's operation:
1. Initially, tape 1 contains w, tapes 2 and 3 are empty, and D is in its start state.
2. Copy tape 1 to tape 2.
3. Use tape 2 to simulate N with input w on one branch of N's computation. Before each step on this branch, look at the next symbol on tape 3 to determine which choice to make among those that are possible. If no more symbols remain on tape 3 (a blank is read) or the choice is invalid, abort this branch by going to stage 4. Also go to stage 4 if a rejecting configuration is encountered. If an accepting configuration is encountered, accept the input.
4. Replace the string on tape 3 with the lexicographically next string, and go to stage 2.
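The sketch below captures the idea in Python under several assumptions of mine (delta stored as a dict of move sets, explicit accept/reject states, "tape 3" represented as a tuple of choice indices, and a known bound max_branching on the number of choices); it illustrates the strategy rather than reproducing the slide's machine cell for cell.

```python
from itertools import count, product

BLANK = ' '

def run_branch(delta, start, accept, reject, w, address):
    """Stage 3: follow the one branch of N selected by the address string,
    for at most len(address) steps. Returns 'accept' or None (branch aborted,
    rejected, or still running)."""
    tape, head, state = dict(enumerate(w)), 0, start
    for choice in address:
        moves = sorted(delta.get((state, tape.get(head, BLANK)), []))  # fixed order
        if choice >= len(moves):        # invalid symbol on tape 3: abort this branch
            return None
        state, symbol, direction = moves[choice]
        tape[head] = symbol
        head = max(head + (1 if direction == 'R' else -1), 0)
        if state == accept:
            return 'accept'
        if state == reject:             # rejecting configuration: abort this branch
            return None
    return None

def simulate(delta, start, accept, reject, w, max_branching):
    """Stages 2-4: try every address string, shortest first (breadth-first)."""
    for length in count(0):
        for address in product(range(max_branching), repeat=length):
            if run_branch(delta, start, accept, reject, w, address) == 'accept':
                return True
    # like D itself, this loops forever if N has no accepting branch
```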
Chapter 3.2

Turing Machine Variations


Corollary 3.11: A language is Turing-recognizable iff some nondeterministic TM recognizes it.
If L is Turing-recognizable, then some nondeterministic TM recognizes it: every deterministic TM is also a nondeterministic TM.
If some nondeterministic TM recognizes L, then L is Turing-recognizable: use Theorem 3.10.
Corollary 3.12: A language is decidable iff some nondeterministic TM decides it.
If L is decidable, then some nondeterministic TM decides it: every deterministic TM is also a nondeterministic TM.
If some nondeterministic TM decides L, then L is decidable: use Theorem 3.10.

Chapter 3.2


Turing Machine Variations


Enumerators: A TM with an attached printer. The language enumerated by E is the set of all strings printed to the printer. Strings may be printed in any order, and repetitions are allowed.

[Figure: an enumerator - a finite control with a work tape and a printer; the printer output so far is aa, baba, abba.]

Theorem 3.13: A language is Turing-recognizable iff some enumerator enumerates it.

Chapter 3.2


Turing Machine Variations


Theorem 3.13: A language is Turing-recognizable iff some enumerator enumerates it.
Proof (first direction): If an enumerator E enumerates a language A, then A is Turing-recognizable. We construct M, a TM that recognizes A. M works the following way.
M = On input w:
1. Run E. Every time E outputs a string, compare it with w.
2. If an output of E matches w, accept; otherwise go back to 1.
Note: We could use the TM of Example 3.5 to compare each string that E enumerates with w.
Question: Is this decidable?
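A minimal sketch of this direction, assuming the enumerator is modelled as a Python generator that yields each string as it is printed (that modelling, and the function names, are my assumptions):

```python
def recognizer_from_enumerator(enumerate_A):
    """Build M: on input w, run E and accept if w ever appears in its output."""
    def M(w):
        for printed in enumerate_A():   # may run forever if w is never printed
            if printed == w:
                return True             # accept
        return False                    # reached only if E prints finitely many strings
    return M
```

This also bears on the question above: if w is not in A and E prints forever, M never halts, so the construction yields a recognizer, not a decider.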
Chapter 3.2

Turing Machine Variations


Theorem 3.13: A language is Turing-recognizable iff some enumerator enumerates it.
Proof (second direction): If language A is Turing-recognizable, then some enumerator enumerates it. In this direction we must show that if some TM M recognizes A, then an enumerator E can be constructed to enumerate A. A is a language over some alphabet Σ; that is, A ⊆ Σ*. Let s1, s2, s3, ... be an infinite list of all the strings in Σ*.
E = Ignore the input.
1. For i = 1, 2, 3, ...
   (a) Run M for i steps on each input s1, ..., si.
   (b) If M accepts any of these strings within those i steps, print it.
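A minimal sketch of E, assuming two hypothetical helpers: run_for_steps(M, s, i), which reports whether M accepts s within i steps, and all_strings(), which yields s1, s2, s3, ... in string order.

```python
from itertools import count, islice

def enumerator_from_recognizer(M, run_for_steps, all_strings):
    """Yield every string of A = L(M); repetitions are allowed."""
    for i in count(1):                        # pass i
        for s in islice(all_strings(), i):    # the strings s1, ..., si
            if run_for_steps(M, s, i):        # run M on s for at most i steps
                yield s                        # "print" s
```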
Chapter 3.2

Turing Machine Variations


Example: Run M for i steps per pass.

  Pass i    Strings tested
  1         s1
  2         s1, s2
  3         s1, s2, s3
  4         s1, s2, s3, s4
  5         s1, s2, s3, s4, s5
  6         s1, s2, s3, s4, s5, s6
  7         s1, s2, s3, s4, s5, s6, s7
  8         s1, s2, s3, s4, s5, s6, s7, s8
  ...       ...

Printer output as the passes proceed: s2; then s2, s5; then s2, s3, s5; then s2, s3, s5; then s1, s2, s3, s5; ...

If M accepts some string, it will eventually get printed out. Why are we running M this way? Because M might run forever on some input; bounding each pass to i steps per string ensures that no single non-halting computation blocks the rest of the enumeration.

Chapter 3.2


Turing Machine Variations


Summary: Many other models of general-purpose computation have been proposed. Example: how about a PDA with a deque? (Peter Joachim, contributor.) All of them have the same essential features as a TM (or fewer), namely:
1. Unrestricted access to unlimited memory
   - FA: limited memory
   - PDA: unlimited memory, but restricted access
2. Only a finite amount of work is performed in any given step
All machines with these two features have the same computational power as a TM.
Chapter 3.2

Turing Machine Variations


What does this mean?
1. Any reasonable model of computation can be simulated by a TM, and vice versa.
2. Any algorithm that runs on any other machine can be run on a TM.
BIG IMPLICATION: Since all of these computational models can simulate one another and compute the same algorithms, we can describe an algorithm simply as something that can be run on such a machine. This gives a single, precise characterization of the class of algorithms. A precise definition of "algorithm" was not worked out until Turing and Church came along.

Chapter 3.2

