CS4114 Coursenotes

Chapter 8 Week 10


8.4. Turing Machine Extensions

8.4.1. More Powerful Machines?

Turing’s Thesis claims that a TM is as powerful as “any machine”.
We might add features to a TM.
If the features let us do something that we could not do before, that disproves Turing’s Thesis.
If we can simulate the new feature with the standard model, this adds support to (but does not prove) the Turing thesis.
As usual, we say that two machine types are equivalent if they
Accept the same languages
Each can simulate the other (one direction is often trivial)

8.4.2. Extensions (1)

2-way infinite tape (Our model) vs 1-way infinite tape
Just bend the infinite tape in the middle to get a one-way tape with two tracks. Then expand the alphabet: each new symbol is an ordered pair of old symbols, encoding the two tracks of the original tape.
Some presentations of TMs use a one-way infinite tape to clarify why you can chain machines.
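The “fold” can be sketched as an index mapping: non-negative cells of the two-way tape go to even positions of the one-way tape, negative cells to odd positions. A minimal sketch; the function names are illustrative, not part of any standard API:

```python
def fold(i):
    """Map a two-way infinite tape index to a one-way tape index.
    Non-negative cells land on even positions, negative cells on odd."""
    return 2 * i if i >= 0 else -2 * i - 1

def unfold(j):
    """Inverse mapping: one-way index back to the two-way index."""
    return j // 2 if j % 2 == 0 else -(j + 1) // 2

# The mapping is a bijection, so no tape cell is lost or duplicated.
for i in range(-5, 6):
    assert unfold(fold(i)) == i
```

Because the mapping is a bijection, a one-way-tape machine can faithfully track every move the two-way-tape machine makes.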

8.4.3. Extensions (2)

Multiple tapes
Again, an expanded alphabet collapses multiple symbols into one: each cell of the single tape holds the tuple of the corresponding cells from all the tapes.
Multiple heads on one tape
Encode the head positions onto the tape (using marked symbols), and simulate moving them around.
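The tuple encoding for multiple tapes (with head positions marked in the cells) can be sketched as follows. The `merge_tapes` helper is hypothetical, purely for illustration:

```python
def merge_tapes(tapes, heads, blank="_"):
    """Collapse k tapes into one tape over an expanded alphabet:
    each cell is a k-tuple of (symbol, head_is_here) pairs."""
    width = max(len(t) for t in tapes)
    merged = []
    for pos in range(width):
        cell = tuple(
            (t[pos] if pos < len(t) else blank, pos == h)
            for t, h in zip(tapes, heads)
        )
        merged.append(cell)
    return merged

# Two tapes, with head 0 on cell 1 and head 1 on cell 0.
tape = merge_tapes(["abc", "xy"], heads=[1, 0])
# tape[0] == (('a', False), ('x', True))
# tape[1] == (('b', True), ('y', False))
```

One simulated step then scans the merged tape to find each marked head, applies the transition, and rewrites the affected tuples.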

8.4.4. Extensions (3)

Two-dimensional “tape”
Convert it to 1D by numbering the cells along the diagonals.

8.4.5. Extensions (4)

Non-determinism
Simulate the nondeterministic behavior in sequence (breadth first): do all computations of length 1, then all computations of length 2, and so on, until one of the non-deterministic choices reaches a halt state.
Non-determinism gives us speed, not ability.
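The breadth-first simulation can be sketched generically over machine configurations. Everything here (`nd_accepts`, the toy successor function) is illustrative, not a machine from the course:

```python
from collections import deque

def nd_accepts(start, step, accepting, max_depth=50):
    """Deterministically simulate a nondeterministic machine:
    explore all length-1 computations, then length-2, etc. (BFS),
    accepting as soon as any branch reaches an accepting configuration.
    `step(config)` returns the set of successor configurations."""
    frontier = deque([(start, 0)])
    seen = {start}
    while frontier:
        config, depth = frontier.popleft()
        if accepting(config):
            return True
        if depth < max_depth:
            for nxt in step(config):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, depth + 1))
    return False

# Toy machine: from n, nondeterministically go to n+2 or n*3.
# Can we reach 11 starting from 1?  (1 -> 3 -> 9 -> 11)
assert nd_accepts(1, lambda n: {n + 2, n * 3}, lambda n: n == 11)
```

The breadth-first order is what makes the simulation sound: a depth-first simulation could run forever down one non-halting branch and miss an accepting branch elsewhere.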

8.4.6. Linear Bounded Automata

We could restrict the general model for a TM:
Instead of an infinite tape, the tape might be only as long as the input
Alternatively: \(c*n\) for constant \(c\) and input length \(n\)
(we can double the available space by simulating two tracks, augmenting the alphabet)
Linear Bounded Automata [LBA]
\(L = \{a^nb^nc^n \mid n \geq 1\}\) can be accepted by an LBA.
So, an LBA is more powerful than a pushdown automaton.
But an LBA turns out to be less powerful than a TM with unrestricted space (though this is hard to prove).
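The acceptance of \(L = \{a^nb^nc^n \mid n \geq 1\}\) in linear space can be sketched in the LBA spirit: use only the input cells, overwriting symbols in place. This Python sketch mimics the tape discipline, not an actual state-machine encoding:

```python
def lba_accepts(s):
    """Accept L = { a^n b^n c^n : n >= 1 } using only the input cells,
    as an LBA would: verify the shape a+ b+ c+, then repeatedly cross
    off one 'a', one 'b', and one 'c' (in place) until none remain."""
    tape = list(s)
    if not tape:
        return False
    # Phase 1: one scan verifies a-block, then b-block, then c-block.
    phase, order = 0, "abc"
    for c in tape:
        while phase < 3 and c != order[phase]:
            phase += 1
        if phase == 3:
            return False
    if tape[0] != "a" or tape[-1] != "c" or "b" not in tape:
        return False
    # Phase 2: cross off one symbol of each kind per pass.
    while any(c != "X" for c in tape):
        for sym in "abc":
            try:
                tape[tape.index(sym)] = "X"   # overwrite in place
            except ValueError:
                return False                  # counts do not match
    return True
```

No cell beyond the input is ever touched, which is exactly the LBA restriction; a PDA cannot do this because its single stack cannot compare three counts.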

8.4.7. A Universal Turing Machine

A Turing Machine that takes a description of a Turing Machine and an input string, and simulates the behavior of that machine on that string.
Need three things:
We need to encode the input machine as a string
We need to encode the input to the machine as a string
We need to encode the current state of the computation on the input machine (its configuration).
Might be easiest to think of these as being on separate tapes.
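A toy simulator makes the three encodings concrete. The encoding scheme here (a transition dictionary, a sparse tape, an explicit configuration) is one arbitrary choice for illustration, not a standard format:

```python
def utm(delta, start, accept, tape_str, blank="_", max_steps=10_000):
    """Toy 'universal' simulator.  The input machine is encoded as
    delta = {(state, symbol): (new_state, new_symbol, move)} with
    move in {-1, +1}.  The machine description, its input, and the
    current configuration (state, head, tape) are all just data."""
    tape = dict(enumerate(tape_str))        # sparse two-way tape
    state, head = start, 0
    for _ in range(max_steps):
        if state == accept:
            return True
        key = (state, tape.get(head, blank))
        if key not in delta:
            return False                    # halted without accepting
        state, tape[head], move = delta[key]
        head += move
    return False                            # step bound exceeded

# Hypothetical example machine: accept strings of a's of even length.
even_as = {
    ("q0", "a"): ("q1", "a", +1),
    ("q1", "a"): ("q0", "a", +1),
    ("q0", "_"): ("qA", "_", +1),
}
assert utm(even_as, "q0", "qA", "aaaa")
assert not utm(even_as, "q0", "qA", "aaa")
```

Note the `max_steps` bound is a concession to testability; a true universal TM simply runs forever on non-halting inputs.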

8.4.8. Recursively Enumerable vs. Recursive

Definition: A language is Recursively Enumerable if there is a Turing Machine that accepts it. [Turing Acceptable]
Definition: A language is Recursive if there is a Turing Machine that accepts it and that halts on every input string. [Turing Decidable]
The terminology of “enumerable” comes from the fact that it is possible to both “count” the number of strings in the language (countably infinite) and to put them in some order.
More on this later!
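The "put them in some order" part can be sketched with the canonical order: shorter strings first, lexicographic within each length. A minimal illustration (for a recursive language, an enumerator can filter this stream through the deciding TM; for an r.e.-but-not-recursive language a subtler dovetailing argument is needed, which is presumably the "more on this later"):

```python
from itertools import count, product

def canonical_order(alphabet):
    """Generate every string over `alphabet` in canonical order:
    by length, then lexicographically within each length."""
    for n in count(0):
        for tup in product(alphabet, repeat=n):
            yield "".join(tup)

gen = canonical_order("ab")
first = [next(gen) for _ in range(7)]
# first == ["", "a", "b", "aa", "ab", "ba", "bb"]
```

Since every string over a finite alphabet appears at some finite position in this list, the set of all strings (and hence any language) is countable.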

8.4.9. More-general Grammars

Unrestricted Grammars: Has productions \(u \rightarrow v\) where \(u\) is in \((V \cup T)^+\) and \(v\) is in \((V \cup T)^*\).

Context Sensitive Grammars: All productions are of the form \(u \rightarrow v\) where \(u, v \in (V \cup T)^+\) and \(|u| \leq |v|\).
“Noncontracting”
Called “context sensitive” because they can always be rewritten so that all productions are in the form \(xAy \rightarrow xvy\) for \(v \in (V \cup T)^*\).
We already know that CSG is “richer” than CFG.
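As an example of that extra richness, here is a standard textbook noncontracting grammar for \(L = \{a^nb^nc^n \mid n \geq 1\}\), a language we know is not context-free (this particular grammar is the common one from the literature, not necessarily the one used in this course):

```latex
S  \rightarrow aSBC \mid aBC \\
CB \rightarrow BC \\
aB \rightarrow ab \\
bB \rightarrow bb \\
bC \rightarrow bc \\
cC \rightarrow cc
```

Every production satisfies \(|u| \leq |v|\). A sample derivation: \(S \Rightarrow aSBC \Rightarrow aaBCBC \Rightarrow aaBBCC \Rightarrow aabBCC \Rightarrow aabbCC \Rightarrow aabbcC \Rightarrow aabbcc\).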

8.4.10. The Language Hierarchy

Recursively Enumerable (Turing Acceptable) Language == Unrestricted Grammar == Turing Machine
Recursive (Turing Decidable) Language == Turing Machine that halts on every input
Context-sensitive Grammar == Linear Bounded Automata
Context-free Grammar == Non-deterministic Pushdown Automata
Deterministic Context-free Grammar == Deterministic Pushdown Automata
Regular Expression == Regular Grammar == DFA == NFA

Each language class in this hierarchy is a proper subset of the one above it.

