With the two operations + and · defined as in table 1, R forms a commutative rig with 0-element `?' and 1-element `='. The operation + can be understood as ``combining parallel information about a relation'': e.g. if we have a `?' y and a `<' y, we have a (`?' + `<') y, which simplifies to a `<' y. The operation · however is ``serial combination'': e.g. a `<' y and y `=' z can be combined into a (`<' · `=') z, simplified: a `<' z. Regarding +, `?' is neutral because it gives no new information, whereas `<' is dominant because it is the strongest information. Regarding ·, the relation `=' is neutral and `?' is dominant because it ``destroys'' all information. Check the tables to see which relation overrides which.
  +  | `<' | `=' | `?'
-----+-----+-----+-----
 `<' | `<' | `<' | `<'
 `=' | `<' | `=' | `='
 `?' | `<' | `=' | `?'

  ·  | `<' | `=' | `?'
-----+-----+-----+-----
 `<' | `<' | `<' | `?'
 `=' | `<' | `=' | `?'
 `?' | `?' | `?' | `?'
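The two tables can be transcribed directly into code. A minimal sketch in Python (the names PLUS, TIMES, plus and times are my own; foetus itself is not written in Python):

```python
# The rig R = {'<', '=', '?'} with the two operations of table 1,
# encoded directly as lookup tables.

PLUS = {  # parallel combination: '<' dominates, '?' is neutral
    ('<', '<'): '<', ('<', '='): '<', ('<', '?'): '<',
    ('=', '<'): '<', ('=', '='): '=', ('=', '?'): '=',
    ('?', '<'): '<', ('?', '='): '=', ('?', '?'): '?',
}

TIMES = {  # serial combination: '=' is neutral, '?' destroys information
    ('<', '<'): '<', ('<', '='): '<', ('<', '?'): '?',
    ('=', '<'): '<', ('=', '='): '=', ('=', '?'): '?',
    ('?', '<'): '?', ('?', '='): '?', ('?', '?'): '?',
}

def plus(a, b):
    """Combine parallel information about a relation."""
    return PLUS[a, b]

def times(a, b):
    """Combine serial information, e.g. a '<' y and y '=' z."""
    return TIMES[a, b]
```

Both operations are commutative, as required for a commutative rig, and one can check `?' + a = a and `=' · a = a for every a.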
Now we can define multiplication on matrices over R as usual:
· : R^{n×m} × R^{m×l} → R^{n×l}
(a_{ij}) · (b_{ij}) := (c_{ij})   with   c_{ij} = Σ_{k=1}^{m} a_{ik} b_{kj}
Why is this a reasonable definition? Assume you have three sets of variables {x_1, ..., x_n}, {y_1, ..., y_m} and {z_1, ..., z_l}, a matrix A = (a_{ij}) ∈ R^{n×m} reflecting the relations between the x_i s and the y_j s (i.e. a_{ij} is the relation x_i a_{ij} y_j), and a matrix B ∈ R^{m×l} reflecting the relations between the y_j s and the z_j s. Then the matrix product C = AB reflects the relations between the x_i s and the z_j s. Because

c_{ij} = a_{i1} b_{1j} + a_{i2} b_{2j} + ... + a_{im} b_{mj},
we have e.g. x_i `<' z_j if we know it by intermediate variable y_1 (a_{i1} b_{1j} = `<') or by intermediate variable y_2 or ....

Call matrices are defined by

CM(n,m) := {(a_{ij}) ∈ R^{n×m} : ∀i ∀j ∀k. j ≠ k ⟹ (a_{ij} = `?' ∨ a_{ik} = `?')}

i.e. every row of a call matrix contains at most one element unequal to `?'.
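The matrix product over R can be sketched directly from this definition (Python; plus, times and matmul are my own names, and the two rig operations are re-implemented compactly):

```python
def plus(a, b):
    # parallel combination: '<' dominates, '?' is neutral
    if '<' in (a, b): return '<'
    return '=' if '=' in (a, b) else '?'

def times(a, b):
    # serial combination: '?' dominates, '=' is neutral
    if '?' in (a, b): return '?'
    return '<' if '<' in (a, b) else '='

def matmul(A, B):
    """C = A·B over R: c_ij = a_i1 b_1j + ... + a_im b_mj."""
    n, m, l = len(A), len(B), len(B[0])
    C = [['?'] * l for _ in range(n)]          # '?' is the 0-element
    for i in range(n):
        for j in range(l):
            for k in range(m):
                C[i][j] = plus(C[i][j], times(A[i][k], B[k][j]))
    return C
```

For instance, matmul([['<', '?'], ['?', '=']], [['=', '?'], ['?', '<']]) yields [['<', '?'], ['?', '<']]: x_1 `<' z_1 via y_1, and x_2 `<' z_2 via y_2.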
f(x,y) = if (x=0) then 0 else let a = min(x,y)-1 in f(a,x)

Here the call argument a is less than both x and y. The next proposition assures that all matrices foetus will have to deal with are call matrices.
· : CM(n,m) × CM(m,l) → CM(n,l)
This operation is well defined: let A ∈ CM(n,m) and B ∈ CM(m,l). Row i of A contains at most one element unequal to `?', say at position k(i); all other summands a_{ik} b_{kj} are `?' and thus neutral in the sum. Hence

c_{ij} = Σ_k a_{ik} b_{kj} = a_{i,k(i)} b_{k(i),j}
Now consider the i-th row of C:

c_i = (c_{ij})_{1≤j≤l} = (a_{i,k(i)} b_{k(i),j})_{1≤j≤l}
Because at most one b_{k(i),j} is unequal to `?', at most one element of c_i is unequal to `?'. Therefore C ∈ CM(n,l). q.e.d.

The set of calls is

𝒞 := {(f^{(n)}, g^{(m)}, A) : f^{(n)} ∈ F^{(n)}, g^{(m)} ∈ F^{(m)}, A ∈ CM(m,n)}
On calls we define the partial operation ``combination of calls'':

∘ : 𝒞 × 𝒞 ⇀ 𝒞
(g^{(m)}, h^{(l)}, B) ∘ (f^{(n)}, g^{(m)}, A) := (f^{(n)}, h^{(l)}, BA)
Meaning: if g calls h with call matrix B and f calls g with call matrix A, then f indirectly calls h with call matrix BA. ∘ cannot be applied to calls that have no ``common function'' like g, therefore it is partial. ∘ can be expanded to sets of calls:

∘ : ℘(𝒞) × ℘(𝒞) → ℘(𝒞)
(C, C') ↦ {c ∘_𝒞 c' : c ∈ C, c' ∈ C', (c, c') ∈ Dom(∘_𝒞)}
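A sketch of call combination in Python, representing a call as a triple (caller, callee, matrix) with the matrix given row-wise as a tuple of tuples; the representation and all names are my assumptions, not foetus internals:

```python
from functools import reduce

def plus(a, b):
    # parallel combination in R: '<' dominates, '?' is neutral
    return '<' if '<' in (a, b) else ('=' if '=' in (a, b) else '?')

def times(a, b):
    # serial combination in R: '?' dominates, '=' is neutral
    return '?' if '?' in (a, b) else ('<' if '<' in (a, b) else '=')

def matmul(B, A):
    """The product B·A over the rig R, as immutable nested tuples."""
    return tuple(tuple(
        reduce(plus, (times(B[i][k], A[k][j]) for k in range(len(A))), '?')
        for j in range(len(A[0])))
        for i in range(len(B)))

def compose(c, c1):
    """(g,h,B) ∘ (f,g,A) = (f,h,B·A); None when the common g is missing."""
    (g, h, B), (f, g1, A) = c, c1
    return (f, h, matmul(B, A)) if g == g1 else None

def compose_sets(C, C1):
    """Lift ∘ to sets of calls, keeping only the defined combinations."""
    return {r for c in C for c1 in C1
            if (r := compose(c, c1)) is not None}
```

The set-level operation is total: pairs without a common function simply contribute nothing to the result.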
Here we combine each call in C with each call in C' to which ∘_𝒞 is applicable and form the set of the combined calls. ∘_{℘(𝒞)} is a total function.

The completion E' of a set of calls E is defined by

c ∈ E' :⟺ ∃ n > 0, c_1, ..., c_n ∈ E : c_1 ∘ ... ∘ c_n = c

E' contains E, is closed under combination (E' ∘ E' ⊆ E'), and is the smallest such set: for every E'' with E ⊆ E'' and E'' ∘ E'' ⊆ E'' we have E' ⊆ E''.
For closure, let d = d_1 ∘ ... ∘ d_n and e = e_1 ∘ ... ∘ e_m with d_1, ..., d_n, e_1, ..., e_m ∈ E. Then d ∘ e = d_1 ∘ ... ∘ d_n ∘ e_1 ∘ ... ∘ e_m ∈ E'. For minimality, let E'' ⊇ E be closed under ∘ and write E^n := {c_1 ∘ ... ∘ c_n : c_1, ..., c_n ∈ E}. By induction E^n ⊆ E'' for all n: E^1 = E ⊆ E'', and E^{n+1} ⊆ E^n ∘ E ⊆ E'' ∘ E'' ⊆ E''.
Now let c ∈ E'. That implies c = c_1 ∘ ... ∘ c_n (c_i ∈ E) for a suitable n. Hence c ∈ E^n ⊆ E''. q.e.d.

The completion can be computed by the iteration

E_0 := E
E_{n+1} := E_n ∪ (E_n ∘ E)
We claim that for some n

E' = E_n = E_{n+1} = E_{n+2} = ...

First: each E_n consists of combinations of calls in E, so E_n ⊆ E' for all n (obviously the E_n grow monotonically). Second: because we have a finite set of starting edges E, and therefore a finite set of reachable vertices and also a finite set of possible edges between two vertices (limited by the number of different call matrices of fixed dimensions), the E_i cannot grow endlessly. Thus an n ∈ ℕ exists with E_n = E_{n+1}.
Third: we show that E' ⊆ E_n for that particular n. Let c ∈ E'. Then there exists an m such that c = d_1 ∘ ... ∘ d_m, therefore c ∈ E_m. Now if m ≤ n then E_m ⊆ E_n; otherwise m > n and hence E_m = E_n. In both cases c ∈ E_n. q.e.d.
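The iteration E_0 = E, E_{n+1} = E_n ∪ (E_n ∘ E) is easy to run. A Python sketch, restricted to unary functions so that every call matrix is a single element of R (all names mine):

```python
def times(a, b):
    # serial combination in R: '?' dominates, '=' is neutral
    return '?' if '?' in (a, b) else ('<' if '<' in (a, b) else '=')

def compose(c, c1):
    """(g,h,b) ∘ (f,g,a) = (f,h,b·a) on 1×1 call matrices."""
    (g, h, b), (f, g1, a) = c, c1
    return (f, h, times(b, a)) if g == g1 else None

def completion(E):
    """E_0 = E, E_{n+1} = E_n ∪ (E_n ∘ E), iterated until the fixpoint."""
    E, En = set(E), set(E)
    while True:
        step = {compose(c, c1) for c in En for c1 in E} - {None}
        if step <= En:          # nothing new: E_{n+1} = E_n = E'
            return En
        En |= step
```

For two mutually recursive unary functions, f calling g with `=' and g calling f with `<', the completion adds the indirect calls (f,f,`<'), (g,g,`<') and (f,g,`<').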
We call

E_{f^{(i)}} := {diag(C) : (f^{(i)}, f^{(i)}, C) ∈ E} ⊆ R^i

the recursion behaviour of function f^{(i)} (diag takes the diagonal of a square matrix). Each element of this set represents one possible recursive call of f^{(i)} and how the orders of all parameters are altered in this call. The diagonals of the call matrices are taken because we only want to know how a parameter relates to its old value in the last call to f^{(i)}. E_{f^{(i)}} is of course a finite subset of R^i.
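Extracting the recursion behaviour from a completed set of calls is then a one-liner; a Python sketch using the triple representation of the earlier sketches (the function name is mine):

```python
def recursion_behaviour(E, f):
    """E_f: the diagonals of the call matrices of all recursive calls
    f -> f contained in the (completed) call set E."""
    return {tuple(C[i][i] for i in range(len(C)))
            for (g, h, C) in E if g == f == h}
```

Calls between different functions are ignored; only the self-calls of f contribute a diagonal.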
In the following we identify lexical orders on parameters with permutations π ∈ S_n of the arguments. Often not all of the parameters are relevant for termination; these are not listed in the lexical order and may appear in the permutation in any sequence.
In example 3.11 (fib') only argument 0 has to be considered to prove termination; the order of arguments 1 and 2 is irrelevant, and therefore both permutations

π = [0 1 2]   and   π = [0 2 1]

are valid continuations of the lexical order ``0''.

π ∈ S_n is a termination order of type (1) on B ⊆ R^n iff

∀ r ∈ B ∃ 1 ≤ k ≤ n : r_{π(k)} = `<' ∧ (∀ 1 ≤ i < k : r_{π(i)} = `=')
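Definition (1) can be checked mechanically. A Python sketch with 0-based argument positions, a permutation given as a list of indices, and the behaviour B as a set of tuples over R (all names mine):

```python
def is_termination_order1(pi, B):
    """Type (1): in every row r some position pi[k] holds '<'
    while all earlier positions pi[i], i < k, hold '='."""
    def row_ok(r):
        for k in pi:            # walk the positions in the order given by pi
            if r[k] == '<':
                return True     # strict decrease found
            if r[k] != '=':
                return False    # '?' destroys the lexical argument
        return False            # no '<' in this row at all
    return all(row_ok(r) for r in B)
```

A row consisting only of `=' entries makes the check fail, as it should: such a call need not decrease anything.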
This definition is a very wide one. In most cases one will look for more special termination orders:

π ∈ S_n is a termination order of type (2) on B ⊆ R^n iff

   ∀ r ∈ B : r_{π(0)} = `<' ,   or
   no r ∈ B has r_{π(0)} = `?' , and π' ∈ S_{n-1} is a termination order of type (2) on B' := {r' : r ∈ B, r_{π(0)} ≠ `<'} ⊆ R^{n-1}

where r' denotes r with component π(0) removed and π' the permutation π restricted to the remaining components.
The algorithm implemented in foetus searches termination orders as in definition (2); it is a one-to-one transfer of this definition. Every termination order matching definition (2) also matches definition (1), and it can easily be shown that if there is a termination order of type (1) there also exists one of type (2).
Example
Let E = {(`=', `<', `?'), (`=', `=', `<'), (`=', `<', `=')} be the given recursion behaviour. Then π = [012] is a type (1) termination order on E and π = [120] is of both types.
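Under the reading of definition (2) given above, the recursive search sketches as follows in Python (0-based positions; instead of physically removing component π(0), the sketch keeps full rows and recurses on the remaining positions; all names mine):

```python
def is_termination_order2(pi, B):
    """Type (2): position pi[0] is '<' in every row, or it is never '?'
    and the remaining positions form a type (2) termination order on
    the rows that are not already decreasing at pi[0]."""
    B = list(B)
    if not B:                    # nothing left to decrease
        return True
    if not pi:                   # rows remain but no positions left
        return False
    p0, rest = pi[0], pi[1:]
    if all(r[p0] == '<' for r in B):
        return True              # first component decreases everywhere
    if any(r[p0] == '?' for r in B):
        return False             # unknown relation destroys the argument
    # keep only the rows where pi[0] stays equal and recurse
    return is_termination_order2(rest, [r for r in B if r[p0] != '<'])
```

On the recursion behaviour of the example, is_termination_order2([1, 2, 0], E) succeeds, while [2, 1, 0] fails because one row has `?' at position 2.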