My problem is: how can I prove that a grammar is unambiguous? I have the following grammars:
and turn it into an unambiguous grammar, which I think is right:
I know that an unambiguous grammar has one parse tree for every term.
There is (at least) one way to prove unambiguity of a grammar $G$ for language $L$. It consists of two steps:

1. Prove $L \subseteq \mathcal{L}(G)$.
2. Prove $[z^n]\, S_G(z) = |L_n|$, where $L_n = L \cap \Sigma^n$.
The first step is pretty clear: show that the grammar generates (at least) the words you want, that is correctness.
The second step shows that $G$ has as many syntax trees for words of length $n$ as $L$ has words of length $n$ -- with 1. this implies unambiguity. It uses the structure function of $G$, which goes back to Chomsky and Schützenberger [1], namely

$\qquad S_G(z) = \sum_{n \geq 0} t_n z^n$

with $t_n$ the number of syntax trees $G$ has for words of length $n$. Of course you need to have $|L_n|$ for this to work.
The nice thing is that $S_G$ is (usually) easy to obtain for context-free languages, although finding a closed form for $t_n$ can be difficult. Transform $G$ into an equation system of functions, with one variable per nonterminal:

$\qquad \left[\, A(z) = \sum_{A \to a_0 A_1 a_1 \dots A_k a_k} z^{|a_0 a_1 \dots a_k|} \prod_{i=1}^{k} A_i(z) \;:\; A \text{ a nonterminal of } G \,\right]$

(here the $a_i$ are terminal strings and the $A_i$ are nonterminals).
This may look daunting but is really only a syntactical transformation, as will become clear in the example. The idea is that generated terminal symbols are counted in the exponent of $z$, and because the system has the same form as $G$, $z$ occurs as often in the sum as terminals can be generated by $G$. Check Kuich [2] for details.
Solving this equation system (computer algebra!) yields $S(z) = S_G(z)$; now you "only" have to pull the coefficients $t_n$ (in closed, general form). The TCS Cheat Sheet and computer algebra can often do so.
Consider the simple grammar $G$ with rules

$\qquad S \to aSa \mid bSb \mid \varepsilon$.
It is clear that $\mathcal{L}(G) = \{ w w^R \mid w \in \{a,b\}^* \}$ (step 1, proof by induction). There are $2^{n/2}$ palindromes of length $n$ if $n$ is even, and $0$ otherwise.
Setting up the equation system yields

$\qquad S(z) = 2 z^2 S(z) + 1$

whose solution is

$\qquad S(z) = \frac{1}{1 - 2z^2} = \sum_{n \geq 0} 2^n z^{2n}$.

The coefficients of $S$ coincide with the numbers of palindromes, so $G$ is unambiguous.
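To make the "computer algebra!" step concrete, here is a minimal sketch in Python with sympy (the variable names and the coefficient check are my own additions, not part of the method as stated); it solves the equation from the example and compares the coefficients of the solution with the palindrome counts:

```python
# Minimal sketch (Python + sympy; names are my own choices), assuming the
# structure-function equation S(z) = 2*z^2*S(z) + 1 derived above.
import sympy as sp

z, S = sp.symbols('z S')

# Each rule S -> aSa and S -> bSb contributes z^2 * S(z); S -> eps contributes 1.
solution = sp.solve(sp.Eq(S, 2*z**2*S + 1), S)[0]
print(sp.simplify(solution))              # 1/(1 - 2*z^2), up to sign normalisation

# Pull the first few coefficients t_n and compare with the palindrome counts.
series = sp.series(solution, z, 0, 10).removeO()
for n in range(10):
    t_n = series.coeff(z, n)
    palindromes = 2**(n // 2) if n % 2 == 0 else 0
    assert t_n == palindromes
    print(n, t_n, palindromes)
```

Since the closed form is so simple here, the coefficients can of course also be read off by hand from the geometric series, as noted above.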
This is a good question, but some Googling would have told you that there is no general method for deciding ambiguity, so you need to make your question more specific.
For some grammars, a proof by induction (over word length) is possible.
Consider for example a grammar $G$ over $\Sigma = \{a,b\}$ given by the following rules:

$\qquad S \to aSa \mid bSb \mid \varepsilon$
All words of length $\leq 1$ in $\mathcal{L}(G)$ -- there's only $\varepsilon$ -- have only one left-derivation.

Assume that all words of length $\leq n$ for some $n \in \mathbb{N}$ have only one left-derivation.

Now consider an arbitrary $w = w_1 w' w_n \in \mathcal{L}(G) \cap \Sigma^n$ for some $n > 0$. Clearly, $w_1 \in \Sigma$. If $w_1 = a$, we know that the first rule in every left-derivation has to be $S \to aSa$; if $w_1 = b$, it has to be $S \to bSb$. This covers all cases. By the induction hypothesis, we know that there is exactly one left-derivation for $w'$. In combination, we conclude that there is exactly one left-derivation for $w$ as well.
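This kind of induction can also be sanity-checked mechanically (which is not a proof, but catches mistakes): enumerate all leftmost derivations of the grammar up to a length bound and verify that no terminal word shows up twice. A minimal sketch in Python, where the dict encoding of the grammar and the bound are my own choices:

```python
# Sketch only: non-terminals are upper-case letters, terminals lower-case,
# and the grammar encoding and length bound are assumptions for illustration.
from collections import Counter

RULES = {'S': ['aSa', 'bSb', '']}   # S -> aSa | bSb | eps
MAX_LEN = 8

def leftmost_derivations(form):
    """Yield one terminal word per leftmost derivation starting from `form`."""
    if sum(c.islower() for c in form) > MAX_LEN:
        return                                    # too many terminals already: prune
    nt = next((i for i, c in enumerate(form) if c.isupper()), None)
    if nt is None:
        yield form                                # no non-terminal left: a finished word
        return
    for rhs in RULES[form[nt]]:                   # always expand the leftmost non-terminal
        yield from leftmost_derivations(form[:nt] + rhs + form[nt + 1:])

counts = Counter(leftmost_derivations('S'))
assert all(c == 1 for c in counts.values())       # every word has exactly one left-derivation
print(sorted(counts))
```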
This becomes harder if the grammar has multiple non-terminals, is not linear, and/or is left-recursive.
It may help to strengthen the claim to all sentential forms (if the grammar has no unproductive non-terminals) and "root" non-terminals.
I think the conversion to Greibach normal form maintains (un)ambiguity, so applying this step first may take care of left-recursion nicely.
The key is to identify one feature of every word that fixes (at least) one derivation step. The rest follows inductively.
Basically, it's a child generation problem. Start with the first expression and generate its children.... Keep doing it recursively (DFS), and after quite a few iterations, see if you can generate the same expanded expression from two different children. If you can, the grammar is ambiguous. There is no way to determine the running time of this algorithm, though. Assume it's safe after generating maybe 30 levels of children :) (Of course, it could bomb on the 31st.)
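A rough sketch of that idea in Python (the grammar encoding, the bounds, and the test grammar $S \to SS \mid a$ are my own choices; the test grammar is deliberately ambiguous so the search actually finds something):

```python
# Bounded DFS over sentential forms: expand the leftmost non-terminal, count how
# many distinct leftmost derivations reach each terminal word, and report words
# reached more than once. Bounds are arbitrary; without them the search need not terminate.
from collections import Counter

RULES = {'S': ['SS', 'a']}          # a known-ambiguous grammar used as a test case
MAX_LEN, MAX_STEPS = 6, 12

def find_ambiguous_words(rules):
    counts = Counter()
    stack = [('S', 0)]
    while stack:
        form, steps = stack.pop()
        nt = next((i for i, c in enumerate(form) if c.isupper()), None)
        if nt is None:
            counts[form] += 1                       # one more leftmost derivation of this word
            continue
        if steps >= MAX_STEPS or len(form) > MAX_LEN:
            continue                                # bound reached: give up on this branch
        for rhs in rules[form[nt]]:
            stack.append((form[:nt] + rhs + form[nt + 1:], steps + 1))
    return [w for w, c in counts.items() if c > 1]  # >1 derivations => provably ambiguous

print(find_ambiguous_words(RULES))                  # e.g. 'aaa' has two leftmost derivations
```

A word reached by two different leftmost derivations really is a witness of ambiguity; finding none within the bounds proves nothing, which is exactly the caveat above.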