# Higher-order Cons-free Interpreters

Constructor rewriting systems are said to be cons-free if any constructor term occurring in the right-hand side of a rule is a subterm of the left-hand side of that rule. Roughly, such systems cannot build new data structures during their evaluation. In earlier work by several authors, (typed) cons-free systems have been used to characterise complexity classes such as polynomial or exponential time or space by varying the type orders and the recursion forms allowed. This paper concerns the construction of interpreters for cons-free term rewriting. Due to their connection with proofs by diagonalisation, interpreters may be of use when studying separation results between complexity classes in implicit computational complexity theory. We are interested in interpreters of type order k > 1 that can interpret any term of strictly lower type order; while this gives us a well-known separation result E^kTIME ⊊ E^(k+1)TIME, the hope is that more refined interpreters subject to syntactic constraints can be used to obtain a notion of faux diagonalisation and to attack open problems in complexity theory.


## 1. Introduction

In [2], Jones introduced cons-free programming: roughly, read-only programs where data structures cannot be created or altered, only read from the input. For example, cons-free programs with data order 0 can compute exactly those decision problems which are in PTIME, while tail-recursive cons-free programs with data order 1 characterise PSPACE. The field of research studying such characterisations is called implicit computational complexity (ICC).
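The read-only discipline can be illustrated with a small sketch in Python (our own example, not from [2]): the procedure may inspect and destructure its input, but never allocates a new data structure; the tuple slice plays the role of descending to a subterm of the input.

```python
# A cons-free style decision procedure: the input is only ever
# inspected and destructured, never extended with new structure.
# (Hypothetical illustration; names are our own.)

def contains_one(bits):
    """Decide whether a tuple of 0/1 bits contains a 1."""
    if bits == ():
        return False
    head, tail = bits[0], bits[1:]   # bits[1:] = "take a subterm"
    if head == 1:
        return True
    return contains_one(tail)
```

Every recursive call works on a suffix of the original input, mirroring how a cons-free rule may only pass subterms of the left-hand side to the right-hand side.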

Jones’ results can easily be generalised to the area of (higher-order) term rewriting. In term rewriting, systems have no fixed evaluation order (so call-by-name or call-by-value can be introduced as needed, but are not required), and reduction is natively non-deterministic. ICC using cons-free term rewriting has been studied by several authors [1, 3].

One important goal of ICC is to provide new separation results between well-known classes. A standard technique for doing so is diagonalisation via interpreters. Roughly, an interpreter for a rewriting system is a term in another rewriting system that, when applied to (a bit string representation of) any term of the former system, will simulate evaluation of that term. It is tantalising to consider interpreters for, and written in, cons-free term rewriting. While interpreters are very well-known in functional programming, they are very rarely seen in rewriting. Furthermore, typical programming of self-interpreters in the wild involves maintaining several intermediate data structures, which is patently impossible in cons-free rewriting, where no new data structures can be built.

Our work concerns the construction of cons-free interpreters for higher-order term rewriting. As a proof of concept, we consider in this paper an interpreter of type order 2 that will evaluate any, suitably encoded, term of type order 1. In future work, we hope that studying further constraints on the syntactic form of the rules of higher-order systems (effectively, constraining the types of recursion used) may lead to more refined diagonalisation, and that such “faux” diagonalisation may lead to results separating known complexity classes.

## 2. Preliminaries

We consider higher-order term rewriting with simple types, and β-reduction as a separate step; we reason modulo α-conversion only. Function symbols f are assigned a type declaration of the form [σ1 × … × σn] ⇒ τ, where τ does not need to be a base type. Rules are assumed to have the form f(ℓ1, …, ℓn) → r (so r can have a functional type). We additionally limit interest to higher-order constructor TRSs, which is to say that each ℓi in the rule above is a constructor term, and does not contain either applications or abstractions.

We will use “data terms” to refer to the set of ground constructor terms containing neither abstractions nor applications. We write T(F, V) for the set of terms built from symbols in F and variables in V, and D for the set of data terms. We will particularly consider an innermost weak reduction strategy, which disallows reductions below an abstraction, and allows a subterm f(s1, …, sn) to be reduced only if all si are abstractions or normal forms.

###### Definition 1.

Let 𝒞 be a class of HOTRSs, and let I = (I_C)_{C∈𝒞} and O = (O_C)_{C∈𝒞} be sets indexed by 𝒞 (shortly denoted I_C and O_C instead of I(C) and O(C)) such that each I_C and O_C is a set of terms of C. A HOTRS P with start symbol interpret is an interpreter for 𝒞 with input set I and output set O if there exist computable injective encodings ⟨·⟩ of systems and ⌜·⌝ of terms such that, for all C ∈ 𝒞, s ∈ I_C and t ∈ O_C:

    s →*_C t   if and only if   interpret(⟨C⟩, ⌜s⌝) →*_P ⌜t⌝

## 3. Cons-free Term Rewriting

Like Jones [2], we limit interest to cons-free rules, adapted to term rewriting as follows:

###### Definition 2 (Cons-free Rules).

A rule ℓ → r, presented using α-conversion in a form where all binders are distinct from the free variables, is cons-free if for all subterms c(s1, …, sn) of r with c a constructor: either c(s1, …, sn) is a subterm of ℓ, or it is a data term. A left-linear (higher-order) constructor TRS R is cons-free if all rules in R are.
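Operationally, the first-order case of this definition can be sketched as follows (a minimal Python sketch with our own representation: terms are nested tuples ("f", t1, …, tn), variables are plain strings, and `cons` is the set of constructor symbols):

```python
# Check cons-freeness of a first-order rule lhs -> rhs: every
# constructor-headed subterm of rhs must be a subterm of lhs
# or a data term. (Representation and names are ours.)

def subterms(t):
    yield t
    if isinstance(t, tuple):
        for arg in t[1:]:
            yield from subterms(arg)

def is_data(t, cons):
    # ground term built from constructors only
    return (isinstance(t, tuple) and t[0] in cons
            and all(is_data(a, cons) for a in t[1:]))

def cons_free(lhs, rhs, cons):
    lhs_subs = list(subterms(lhs))
    return all(t in lhs_subs or is_data(t, cons)
               for t in subterms(rhs)
               if isinstance(t, tuple) and t[0] in cons)
```

For instance, with constructors {s, 0}, the rule f(s(x)) → f(x) is cons-free, while f(x) → s(x) is not, since s(x) is neither a subterm of the left-hand side nor a data term.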

Cons-free term rewriting enjoys many convenient properties. Most importantly, the set of data terms that a term may reduce to using cons-free rules is limited by the data terms in the start term and the right-hand sides of rules, as described by the following definition:

###### Definition 3.

For a given ground term s, the set B_s contains all data terms which occur as (a) a subterm of s, or (b) a subterm of the right-hand side of some rule in R.

B_s is a set of data terms, is closed under subterms and, since R is fixed, has a number of elements linear in the size of s. The property that no new data is generated during reduction is formally expressed by the following definition and result:
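The set of Definition 3 can be computed directly, sketched here in Python under a hypothetical representation of our own (terms as nested tuples ("f", t1, …, tn), variables as strings, `cons` the set of constructor symbols):

```python
# Collect the data terms occurring as subterms of the start term
# or of some right-hand side (Definition 3). Names are our own.

def is_data(t, cons):
    # ground term built from constructors only
    return (isinstance(t, tuple) and t[0] in cons
            and all(is_data(a, cons) for a in t[1:]))

def data_subterms(t, cons):
    out = set()
    def walk(u):
        if is_data(u, cons):
            out.add(u)
        if isinstance(u, tuple):
            for a in u[1:]:
                walk(a)
    walk(t)
    return out

def B(start, rules, cons):
    result = data_subterms(start, cons)
    for _lhs, rhs in rules:
        result |= data_subterms(rhs, cons)
    return result
```

Since any subterm of a data term is again a data term, the resulting set is closed under subterms, as required in Definition 4.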

###### Definition 4 (B-safety).

Let B be a set of data terms which (a) is closed under subterms, and (b) contains all data terms occurring in the right-hand side of a rule in R. A term s is B-safe if for all subterms t of s: if t = c(t1, …, tn) with c a constructor, then t ∈ B.

###### Lemma 5.

Let R be cons-free. For all terms s, t: if s is B-safe and s →_R t, then t is B-safe.

Thus, for a decision problem with start term s = f(d1, …, dn) (where all di are data terms), all terms occurring in the reduction are B_s-safe. This insight allows us to limit interest to B-safe terms in most cases, and is instrumental in obtaining the following results:

###### Proposition 1.

The class of decision problems in E^kTIME contains exactly those functions which can be accepted by:

• a cons-free HOTRS of order k with a weak-innermost reduction strategy;

• a cons-free confluent HOTRS of order k with a weak-innermost reduction strategy.

That is, adding determinism does not make a difference to the characterisation result. Hence for simplicity we will focus on confluent TRSs in particular.

## 4. Interpretations

We consider a system with the following constructors:

    0    : [bitstring] ⇒ bitstring       Var  : [bitstring] ⇒ term
    1    : [bitstring] ⇒ bitstring       Fun  : [bitstring × termlist] ⇒ term
    ⊳    : bitstring                     ⊥    : term
    []   : termlist                      ∅    : rules
    ::   : [term × termlist] ⇒ termlist  Rule : [term × term × rules] ⇒ rules

Bit strings can be used to encode numbers in the usual way; we assume a unique bit string for each number, so without leading zeros. To encode a first-order TRS R, we enumerate the function symbols, writing F = {f1, …, fM}, and in each rule we assume the variables are among {x1, …, xN} for some N. For the term encoding, let:

    [xi]F = Var(¯i)
    [fi(s1, …, sn)]F = Fun(¯i, [s1]F :: … :: [sn]F :: [])

Here, the list constructor :: is written in an infix, right-associative way. Writing R = {ℓ1 → r1, …, ℓm → rm}, the TRS is encoded as the following term:

 Rule([ℓ1],[r1],Rule([ℓ2],[r2],Rule(…,Rule([ℓm],[rm],∅)…)))

Of course, strictly speaking an interpreter should operate on bit strings only. We have chosen this more verbose encoding because it makes the resulting interpreter-system easier to understand, and the same ideas can be transferred to a more restrictive encoding.
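The encoding can be sketched in Python (a hypothetical rendering with our own conventions: nested tuples for the target constructors, "end" playing the role of ⊳, most significant bit outermost, and variables named "x1", "x2", …):

```python
# Encode natural numbers as bit strings and terms as Var/Fun trees.
# Representation choices (tuple constructors, bit order, naming) are ours.

def encode_nat(i):
    t = ("end",)                      # the end-of-string marker (⊳)
    if i == 0:
        return ("bit0", t)            # convention for zero is ours
    while i > 0:
        t = ("bit1", t) if i % 2 else ("bit0", t)
        i //= 2
    return t                          # most significant bit outermost

def encode_term(t, funs):
    # funs maps each function symbol f_i to its index i
    if isinstance(t, str):            # variable "x<i>" becomes Var(i)
        return ("Var", encode_nat(int(t[1:])))
    head, args = t[0], t[1:]
    lst = ("nil",)                    # the empty term list []
    for a in reversed(args):          # build s1 :: ... :: sn :: []
        lst = ("cons", encode_term(a, funs), lst)
    return ("Fun", encode_nat(funs[head]), lst)
```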

We seek to define a confluent cons-free HOTRS P with a weak-innermost reduction strategy which, given (the encoding of) a confluent, cons-free first-order TRS R with innermost reduction, and a data term s, obtains the encoding of the normal form of s, provided this normal form is a data term. Formally, if R is encoded as the term ρ, we must have normalform(ρ, [s]F) →* [t]F for any ground start term s with data term normal form t, and normalform(ρ, [s]F) →* ⊥ if the normal form of s is not a data term. This can be used to determine whether s →*_R true (but is more general).

Note that, since P must be cons-free, we cannot represent intermediate terms (e.g. the direct reduct of the start term). Instead, P will operate on pairs (w, γ), where γ is a “substitution”: a term of type [bitstring] ⇒ term mapping the representations of variables in w to (representations of) data terms or to ⊥ (which indicates any term in normal form that is not a data term).

To work! We will recurse over the set of rules, but carry along the complete set, as well as the arguments to the start term (which together define the set B), for later use.

    normalform(R, Fun(f, args)) → normalise(Fun(f, args), λx.⊥, R, args)
    normalise(Var(x), γ, R, bs) → γ⋅x
    normalise(Fun(f, args), γ, R, bs) → findrule(Fun(f, args), γ, R, R, bs)
    findrule(w, γ, ∅, R, bs) → substitute(w, γ, bs, R)
    findrule(w, γ, Rule(ℓ, r, tl), R, bs) → test(match(w, γ, ℓ, λx.⊥, R, bs), w, γ, ℓ, r, tl, R, bs)
    test(δ, w, γ, ℓ, r, tl, R, bs) → test2(δ⋅⊳, δ, w, γ, ℓ, r, tl, R, bs)
    test2(⊥, δ, w, γ, ℓ, r, tl, R, bs) → normalise(r, δ, R, bs)
    test2(Var(⊳), δ, w, γ, ℓ, r, tl, R, bs) → findrule(w, γ, tl, R, bs)

Thus, we normalise a term w just by substituting if w is a variable or no rule matches (so w is in normal form either way); the substitution reduces to ⊥ if the resulting normal form is not a data term. The function match is used to test whether a rule matches and to find the relevant substitution in one go: in case of a match, the resulting substitution δ is such that δ⋅x reduces to ⊥ for every x which does not refer to a variable in ℓ, and in case of no match, it reduces to λx.Var(⊳) instead (Var(⊳) is not a representation of any term). In the case of a successful match, we continue to normalise r.
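As a point of reference, the control flow of normalform/normalise/findrule can be mirrored by a direct-style normaliser, sketched here in Python over first-order terms (our own code; unlike the HOTRS above it allocates freely, so it is not itself cons-free):

```python
# Innermost normalisation for a first-order TRS: normalise the arguments,
# then try each rule in turn (findrule); if one matches, substitute and
# continue with the right-hand side, otherwise the term is a normal form.
# Terms are nested tuples ("f", t1, ..., tn); variables are strings.

def match(pat, term, subst):
    # extend subst so that pat instantiated by subst equals term, else None
    if isinstance(pat, str):
        if pat in subst:
            return subst if subst[pat] == term else None
        return {**subst, pat: term}
    if (not isinstance(term, tuple) or pat[0] != term[0]
            or len(pat) != len(term)):
        return None
    for p, t in zip(pat[1:], term[1:]):
        subst = match(p, t, subst)
        if subst is None:
            return None
    return subst

def apply(t, subst):
    if isinstance(t, str):
        return subst.get(t, t)
    return (t[0],) + tuple(apply(a, subst) for a in t[1:])

def normalise(t, rules):
    if isinstance(t, str):
        return t
    t = (t[0],) + tuple(normalise(a, rules) for a in t[1:])  # innermost
    for lhs, rhs in rules:                                   # findrule
        subst = match(lhs, t, {})
        if subst is not None:
            return normalise(apply(rhs, subst), rules)
    return t                                                 # normal form
```

For example, with rules add(0, y) → y and add(s(x), y) → s(add(x, y)), normalising add(s(0), s(0)) yields s(s(0)).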

To define substitute, we note that w is always a subterm of either the start term or the right-hand side of a rule, and is not a variable; the result of substituting is a normal form, so we must reduce to a data term (which, by Lemma 5, is a subterm of the start term or of the right-hand side of some rule) or to ⊥. These observations give the following rules:

    eqbits(⊳, ⊳) → true
    eqbits(⊳, b(ys)) → false                  ⟦for b ∈ {0,1}⟧
    eqbits(a(xs), ⊳) → false                  ⟦for a ∈ {0,1}⟧
    eqbits(a(xs), a(ys)) → eqbits(xs, ys)     ⟦for a ∈ {0,1}⟧
    eqbits(a(xs), b(ys)) → false              ⟦for a, b ∈ {0,1} ∧ a ≠ b⟧

    eqsubst(Var(x), γ, t) → eqsubst(γ⋅x, λy.⊥, t)
    eqsubst(⊥, γ, t) → false
    eqsubst(Fun(f, as), γ, Var(y)) → false
    eqsubst(Fun(f, as), γ, Fun(g, bs)) → eqcheck(eqbits(f, g), as, γ, bs)
    eqcheck(false, as, γ, bs) → false
    eqcheck(true, [], γ, []) → true
    eqcheck(true, s :: ss, γ, t :: ts) → eqcheck(eqsubst(s, γ, t), ss, γ, ts)

    substitute(w, γ, bs, R) → substcheckbs(subst(w, γ, bs), w, γ, R)
    subst(w, γ, []) → ⊥
    subst(w, γ, b :: bs) → subst2(eqsubst(w, γ, b), b, w, γ, bs)
    subst2(true, b, w, γ, bs) → b
    subst2(false, Fun(f, as), w, γ, bs) → subst3(subst(w, γ, as), w, γ, bs)
    subst3(⊥, w, γ, bs) → subst(w, γ, bs)
    subst3(Fun(f, as), w, γ, bs) → Fun(f, as)
    substcheckbs(Fun(f, as), w, γ, R) → Fun(f, as)
    substcheckbs(⊥, w, γ, R) → …

Thus, to obtain the result of substituting w, we find the subterm of the start term or of the rules which is equal to it. The rules for the latter case (verifying that w is a constructor term and finding a subterm of the right-hand side of a rule equal to it, or reducing to ⊥ if either part fails) have been omitted for space reasons.

The next task is to find whether w instantiates some left-hand side ℓ, and to obtain the relevant substitution if so. Here, “instantiates” should not be taken literally, as w is not necessarily a basic term. Rather, writing ℓ = f(ℓ1, …, ℓn), we seek to confirm whether w, after substituting and normalising its arguments, equals ℓδ for some substitution δ. Note that ℓ is necessarily left-linear, and its strict subterms are constructor terms.

    match(Fun(f, ss), γ, Fun(g, ts), δ, R, bs) → matchcheck(eqbits(f, g), ss, γ, ts, δ, R, bs)
    matchcheck(false, ss, γ, ts, δ, R, bs) → λx.Var(⊳)
    matchcheck(true, ss, γ, ts, δ, R, bs) → matchall(ss, γ, ts, δ, R, bs)
    matchall([], γ, [], δ, R, bs) → δ
    matchall(s :: ss, γ, t :: ts, δ, R, bs) → matchall(ss, γ, ts, instantiate(normalise(s, γ, R, bs), t, δ), R, bs)

Thus, matchall iterates over both argument lists, updating δ for each pair in which the (normalised) argument of w instantiates the corresponding subterm of ℓ. This uses the helper function instantiate, which assumes that its first argument is a data term or ⊥. If any of the instantiations fails (so if the rule does not match), δ is updated so that δ⋅x reduces to Var(⊳) for all bit strings x which do not correspond to a variable in ℓ.
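The failure convention used by match and matchall, where an unsuccessful match is signalled through the substitution itself rather than a separate error value, can be sketched as follows (our own Python rendering: data terms are nested tuples, ⊥ is None, and the sentinel FAIL plays the role of λx.Var(⊳)):

```python
# Fold over the argument lists of w and of the left-hand side, accumulating
# bindings; one failed instantiation poisons the whole substitution, which
# the caller detects with a single probe (as test/test2 do with ⊳).

FAIL = "FAIL"   # stands in for the substitution λx.Var(⊳)

def matchall(pats, terms, delta):
    for p, t in zip(pats, terms):
        delta = instantiate(t, p, delta)
    return delta

def instantiate(value, pat, delta):
    # pat is a constructor term or a variable; value is a data term or None
    if delta is FAIL:
        return FAIL
    if isinstance(pat, str):             # variable: record the binding
        return {**delta, pat: value}
    if value is None or value[0] != pat[0] or len(value) != len(pat):
        return FAIL
    return matchall(pat[1:], value[1:], delta)
```

A caller then needs only one check on the resulting substitution to decide between the two test2 rules.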

## 5. Conclusions

Thus we have seen:

There is a second-order cons-free confluent HOTRS which is an interpreter for the class of first-order cons-free confluent TRSs.

Assuming Proposition 1 holds, we could now use a diagonalisation argument to obtain a new proof of the hierarchy result E^1TIME ⊊ E^2TIME. Generalising the program to higher orders, we should also be able to obtain that each E^kTIME ⊊ E^(k+1)TIME.

While these are known results, the ideas used in this proof might transfer to more ambitious projects. For example, if we could give a tail-recursive variation of the interpreter then, following Jones [2], we might obtain a proof of a corresponding separation result.

## References

• [1] D. de Carvalho and J. Simonsen. An implicit characterization of the polynomial-time decidable sets by cons-free rewriting. In G. Dowek, editor, RTA-TLCA ’14, volume 8560 of LNCS, pages 179–193, 2014.
• [2] N. Jones. Life without cons. JFP, 11(1):5–94, 2001.
• [3] C. Kop and J. Simonsen. Complexity hierarchies and higher-order cons-free rewriting. In D. Kesner and B. Pientka, editors, FSCD ’16, volume 52 of LIPIcs, pages 23:1–23:18, 2016.