Last modified: 17-May-11.

The short answer: I don't know.

During the literature search for my Master's thesis, I sifted through a wide variety of grammars, both adaptive and nonadaptive. The similarities between imperative adaptive grammars and automata soon led me to realize that I don't know what a grammar is. (In fact, I recognized this difficulty months before I had the sense to wonder what makes a grammar adaptive.) As with adaptive vs. nonadaptive grammars, I have yet to find any formalism whose classification as grammar vs. automaton isn't intuitively clear; but whereas I found a plausible answer to the adaptive/nonadaptive question, after several years of admittedly sporadic investigation I still can't justify my belief that a Turing Machine is not a grammar.

The most nearly successful heuristic I've found so far concerns the interactions between metasyntax, syntax, and semantics. Syntactic values are the terminal symbols and strings that make up the language generated by the grammar. Metasyntactic values, in the sense I'm using here, are descriptive of syntax, as for example the nonterminals of a Chomsky grammar. Semantic values carry information about the generated syntax beyond what is carried by the metasyntax.

Consider a complete derivation under an (extended) attribute grammar. The metasyntactic values are the nonterminals; the syntactic values are the terminals; and the semantic values are the attribute values attached to the nonterminals. In the completed derivation, metasyntax and semantics occur at the left-hand side, while syntax occurs at the right-hand side.
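To make the placement concrete, here is a minimal sketch (names and grammar are my own illustrative choices, not from any particular formalism) of a derivation under a toy attribute grammar for { a^n b^n }: the nonterminal S is the metasyntax, the terminals a and b are the syntax, and a synthesized attribute n is the semantics riding along on the left-hand sides.

```python
# Toy attribute grammar for { a^n b^n : n >= 0 }, hypothetical example.
# Metasyntax: the nonterminal S.  Syntax: the terminals a, b.
# Semantics: the attribute n attached to S on each left-hand side.

def derive(n):
    """Expand S[n] via  S[n] -> a S[n-1] b  and  S[0] -> (empty),
    returning the generated terminal string (the syntax)."""
    if n == 0:
        return ""                         # S[0] -> empty
    return "a" + derive(n - 1) + "b"      # S[n] -> a S[n-1] b

# Metasyntax (S) and semantics (n) sit on the left of each production
# instance; the terminals emitted on the right are the syntax.
print(derive(3))  # -> aaabbb
```

The point of the sketch is only where things sit: the attribute value never appears among the terminals; it decorates the nonterminal on the left.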

Contrast this with a complete computation by a Turing Machine. The metasyntactic values are the machine states, the syntax is the initial content of the tape, and the semantics is the halting content of the tape. In the completed computation, metasyntax occurs at the extreme left-hand side, as before; but now syntax is at the left and semantics at the right.
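The same placement can be sketched for a Turing Machine. Below is a minimal simulator (the encoding and the example machine are my own hypothetical choices): the machine states are the metasyntax, the initial tape is the syntax, and the halting tape is the semantics.

```python
# Minimal one-tape Turing Machine sketch, hypothetical example.
# Metasyntax: the states.  Syntax: the initial tape.
# Semantics: the halting tape.

def run_tm(tape, rules, state="q0", halt="qH"):
    """Run rules[(state, symbol)] = (write, move, next) until halt;
    '_' is the blank symbol."""
    cells = dict(enumerate(tape))   # sparse tape
    pos = 0
    while state != halt:
        sym = cells.get(pos, "_")
        write, move, state = rules[(state, sym)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Unary successor: scan right past the 1s, append one more 1, halt.
succ = {
    ("q0", "1"): ("1", "R", "q0"),
    ("q0", "_"): ("1", "R", "qH"),
}
print(run_tm("111", succ))  # syntax "111" in, semantics "1111" out
```

Here syntax and semantics occupy the same tape at different ends of the computation, which is exactly the asymmetry with the attribute-grammar case.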

The straightforward generalization of this observation would be that if metasyntax and semantics are on the left and syntax on the right, it's a grammar; whereas if metasyntax and syntax are on the left and semantics on the right, it's an automaton. Unfortunately, not only is there a counterexample, but it's of my own making: RAGs (Recursive Adaptive Grammars) are a formalism in which metasyntax is always on the left, but syntax and semantics can be on either side so long as they are opposite each other.