# On the logical structure of choice and bar induction principles

We develop an approach to choice principles and their contrapositive bar-induction principles as extensionality schemes connecting an "intensional" or "effective" view of respectively ill- and well-foundedness properties to an "extensional" or "ideal" view of these properties. After classifying and analysing the relations between different intensional definitions of ill-foundedness and well-foundedness, we introduce, for a domain A, a codomain B and a "filter" T on finite approximations of functions from A to B, a generalised form GDC_A,B,T of the axiom of dependent choice and dually a generalised bar induction principle GBI_A,B,T such that:

GDC_A,B,T intuitionistically captures the strength of

∙ the general axiom of choice expressed as ∀a ∃b R(a, b) ⇒ ∃α ∀a R(a, α(a)) when T is a filter that derives point-wise from a relation R on A × B without introducing further constraints,

∙ the Boolean Prime Filter Theorem / Ultrafilter Theorem if B is the two-element set 𝔹 (for a constructive definition of prime filter),

∙ the axiom of dependent choice if A = ℕ,

∙ Weak König's Lemma if A = ℕ and B = 𝔹 (up to weak classical reasoning).

GBI_A,B,T intuitionistically captures the strength of

∙ Gödel's completeness theorem in the form validity implies provability for entailment relations if B = 𝔹,

∙ bar induction when A = ℕ,

∙ the Weak Fan Theorem when A = ℕ and B = 𝔹.

Contrastingly, even though GDC_A,B,T and GBI_A,B,T smoothly capture several variants of choice and bar induction, some instances are inconsistent, e.g. when A is 𝔹^ℕ and B is ℕ.


## I Introduction

### I-A Bar induction, dependent choice and their variants as extensionality principles

For a domain A, there are different ways to define a well-founded tree branching over A. A first possibility is to define it as an inductive object built from leaves and from nodes associating a subtree to each element in A. We will call this definition intensional. Using a syntax familiar from functional programming languages or Martin-Löf-style type theory, such intensional trees correspond to inhabitants of an inductive type:

type tree = Leaf | Node of (A -> tree)

A second possibility is a definition which we shall call extensional, and which is probably more standard in the context of non-type-theoretic mathematics. Let A* denote the set of finite sequences of elements of A, with ⟨⟩ denoting the empty sequence and u⋆a the extension of the sequence u with a from A. Then an extensional tree is a downwards-closed predicate T over A*. Finite sequences are interpreted as finite paths from the root of a tree and the predicate determines which paths are contained in the tree. We say that T is extensionally well-founded if every infinite path eventually "leaves" the tree, i.e. there is an initial finite prefix u of the path such that u (as a path from the root) is not contained in T.

The intensional definition is stronger: to any inductively-defined tree t, we can associate an extensionally well-founded tree by recursion on t as follows:

where a⋆u, a particular case of concatenation, prefixes u with a. We can then prove by induction on t that, for every infinite path α, some restriction of α to its first n values leaves the associated tree.
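The recursion above can be sketched concretely. The following Python fragment is a sketch under our own conventions (A fixed to {0, 1}, finite sequences as tuples, class and function names ours, not the paper's): it computes membership in the extensional tree associated to an inductively-defined tree.

```python
# Intensional trees: a leaf, or a node carrying one subtree per element of A.
class Leaf:
    pass

class Node:
    def __init__(self, children):
        self.children = children  # a function from A to trees

def member(t, u):
    """Does the finite sequence u (a tuple over A) lie inside the tree t?"""
    if isinstance(t, Leaf):
        return False              # a leaf contains no path, not even the empty one
    if u == ():
        return True               # a node contains at least the root
    return member(t.children(u[0]), u[1:])

# Example over A = {0, 1}: a tree all of whose branches leave after two steps.
t = Node(lambda a: Node(lambda b: Leaf()))

assert member(t, ()) and member(t, (0,)) and member(t, (1,))
assert not member(t, (0, 0))      # every path of length 2 has left the tree
```

Since `member` recurses structurally on `t`, every infinite path must reach a `Leaf` after finitely many steps, which is exactly the extensional well-foundedness of the associated tree.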

To reflect that t is related to the tree it induces, we can define a realisability relation between inhabitants of tree and predicates T over finite sequences as follows:

• Leaf realises T if the empty sequence is not in T

• Node(f) realises T if the empty sequence is in T and, for all a, f(a) realises the subtree of T rooted at a

Then, we can prove by induction on t that t realises the tree it induces.

Bar induction, introduced by Brouwer and further analysed e.g. by Kleene and Vesley [21], can be seen as the converse property, namely that any extensionally well-founded tree can be turned into an inductively-defined tree that realises it, so that, in the end, the intensional and extensional definitions of well-foundedness are equivalent. (Kleene and Vesley [21] used respectively the terms "inductive" and "explicit" for what we call intensional and extensional.)

At its core, bar induction is the statement "barred implies inductively barred" for a predicate on finite sequences. As studied e.g. in Howard and Kreisel [16], when used on a negated predicate, this reduces to "T extensionally well-founded implies T inductively well-founded", where inductively well-founded abbreviates "inductively well-founded at the empty sequence", and inductively well-founded at u is itself defined by the following clauses:

• if u is not in T then T is inductively well-founded at u

• if, for all a, T is inductively well-founded at u⋆a, then T is inductively well-founded at u

Then, it can be proved that being inductively well-founded at the empty sequence is itself not different from the existence of an intensional tree t (hidden in the structure of any proof of inductive well-foundedness) such that t realises T. This justifies our claim that bar induction is in the end a way to produce an intensionally well-founded tree from an extensionally well-founded one.
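For a decidable predicate over a finite branching domain, the two clauses above can be read directly as a recursive procedure. The following Python sketch is ours (A fixed to {0, 1}, sequences as tuples); it terminates precisely on inputs where the predicate really is well-founded below the given sequence.

```python
A = (0, 1)  # a finite branching domain, fixed for the example

def inductively_wf_at(T, u):
    """The two clauses read as a recursion: either u has already left T,
    or every one-step extension of u is inductively well-founded."""
    if not T(u):
        return True                                        # first clause
    return all(inductively_wf_at(T, u + (a,)) for a in A)  # second clause

# The tree of sequences of length < 3 over {0, 1} is inductively well-founded:
short = lambda u: len(u) < 3
assert inductively_wf_at(short, ())
assert inductively_wf_at(short, (0, 1))
```

The call tree of a successful run is exactly the intensional tree "hidden in the structure of the proof" that the text alludes to.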

Now, if bar induction can be considered as an extensionality principle, the same should hold for its contrapositive, which is logically equivalent to the axiom of dependent choice. This means that it should eventually be possible to rephrase the axiom of dependent choice as a principle asserting that, if a tree is coinductively ill-founded, then it is extensionally ill-founded (i.e. an infinite branch can be found). We will investigate this direction in Section II, together with precise relations between these principles and their restriction to finitely-branching trees, namely Kőnig's Lemma (the spelling König's Lemma is also common; we respect here the original Hungarian spelling of the author's name) and the Fan Theorem, introducing a systematic terminology to characterise and compare these different variants.

Note in passing that the approach of considering bar induction and choice principles as extensionality principles is consistent with the methodology developed e.g. by Coquand and Lombardi: to avoid the necessity of choice or bar induction axioms, mathematical theorems are restated using the (co-)inductively-defined notions of well- and ill-foundedness rather than the extensional notions [9, 10].

### I-B Weak Kőnig’s Lemma at the intersection of Boolean Prime Filter Theorem and Dependent Choice

We know from the classical reverse mathematics of the subsystems of second-order arithmetic [29] that the binary form of Kőnig's Lemma, namely Weak Kőnig's Lemma (WKL), has the strength of Gödel's completeness theorem (for a countable language). Classical reverse mathematics of the axiom of choice and its variants in set theory [14, 27, 20, 11] also tells us that Gödel's completeness theorem has the strength of the Boolean Prime Filter Theorem (for a language of arbitrary cardinality). This suggests that the Boolean Prime Filter Theorem is the "natural" generalisation of WKL from countable to arbitrary cardinalities.

On the other side, Weak Kőnig's Lemma is a consequence of the axiom of Dependent Choice (note that Kőnig's Lemma is a theorem of set theory and that we need to place ourselves in a sufficiently weak metatheory to state this result), the same way as its contrapositive, the Weak Fan Theorem, is an instance of Bar Induction, itself related to the contrapositive of the axiom of Dependent Choice. This suggests that there is a common principle which subsumes both the Axiom of Dependent Choice and the Boolean Prime Filter Theorem, with Weak Kőnig's Lemma at their intersection.

Such a principle is stated in Section III, where it is shown that the ill-founded version indeed generalises the axiom of Dependent Choice and the well-founded version generalises Bar Induction. In the same section, we also show that one of the instances of the ill-founded version captures the general Axiom of Choice, but that, in its full generality, the new principle is actually inconsistent.

Section IV is devoted to showing that the Boolean Prime Filter Theorem is an instance of the generalised axiom of Dependent Choice. In particular, this highlights that the notions of ideal and filter generalise the notion of a binary tree, where the prefix order between paths of the tree is replaced by an inclusion order between non-sequentially-ordered paths, now seen as finite approximations of a function from A to the two-element set 𝔹.

### I-C Methodology and summary

For our investigations to apply both to classical and to intuitionistic mathematics, we carefully distinguish between the choice axioms (seen as ill-foundedness extensionality schemes) and bar induction schemes (seen as well-foundedness extensionality schemes).

All in all, the correspondences we obtain are summarised in Table I where the definitions of the different notions can be found in the respective sections of the paper.

## II The logical structure of dependent choice and bar induction principles

### II-A Metatheory

We place ourselves in a metatheory capable of expressing arithmetic statements. In addition to the type ℕ of natural numbers together with induction and recursion, we assume the following constructions to be available:

• The type 𝔹 of Boolean values together with a mechanism of definition by case analysis. It shall be convenient to also allow the definition of propositions by case analysis on a Boolean, with the expected logical meaning.

• For any type A, the type A* of finite sequences over A, whose elements shall generally be ranged over by the letters u, v, w, … We write ⟨⟩ for the empty sequence and u⋆a for the extension of the sequence u with the element a. We write |u| for the length of u and u(n) for the n-th element of u when n < |u|. We write u⋆v for the concatenation of u and v. We write u ≤ v to mean that u is an initial prefix of v. This is inductively defined by:

• u ≤ u

• if u ≤ v then u ≤ v⋆a

We shall also support case analysis over finite sequences under the form of a match operator.

• For any two types A and B, the type of functions from A to B. Functions can be built by λ-abstraction, as in λa.b for a in A and b in B, and used by application, as in f(a) for f a function from A to B and a in A. To get closer to the traditional notations, we shall also abbreviate α(n) into αₙ.

• A type reifying the propositions as a type, so that predicates over A are maps from A to propositions. We shall allow predicates to be defined inductively (smallest fixpoint) or coinductively (greatest fixpoint).

• For any type A and predicate P over A, the subset of elements of A satisfying P.

This is a language for higher-order arithmetic but in practice, we shall need quantification just over functions and predicates of (apparent) rank 1 (i.e. with no arrow types occurring in their domains). We however also allow arbitrary type constants to occur, so we can think of our effective metatheory as a second-order arithmetic generic over arbitrary more complex types. In practice, our metatheory could typically be the image of arithmetic in set theory or in an impredicative type theory. We will in any case use the notation a : A to mean that a has type A when A is a type, which, in set theory, will become a belongs to the set A.

The metatheory can be thought of as classical, i.e. associated to a classical reading of connectives, but in practice, unless stated otherwise, most statements will have proofs compatible with a linear, intuitionistic or co-intuitionistic reading of connectives too. Using linear logic as a reference for the semantics of connectives [13], each connective is then read as its linear counterpart, and negation as the appropriate linear dual.
An intuitionistic reading will add a "!" (of-course connective of linear logic) in front of negative connectives while a co-intuitionistic reading will add a "?" (why-not connective of linear logic) in front of positive connectives.

### II-B Infinite sequences

We write A^ℕ for the infinite (countable) sequences of elements of A. There are different ways to represent such an infinite sequence:

• We can represent it as a function, i.e. as a functional object of type ℕ → A.

• We can represent it as a total functional relation, i.e. as a relation R over ℕ and A such that for each n there is a unique a with R(n, a).

• Additionally, when A is 𝔹, an extra possible representation is as a predicate P over ℕ, with intended meaning that the n-th value is tt if P(n) holds and ff if ¬P(n) holds (and unknown meaning otherwise).

The representation as a functional relation is weaker in the sense that a function induces a functional relation but the converse requires the axiom of unique choice. In the sequel, we will use the notations α(n) = a and λn.a to mean different things depending on the representation chosen for α. In the first case, α(n) = a means equality in A and λn.a defines a function. In the second case, however, α(n) = a means that the relation holds of n and a, and λn.a defines the corresponding functional relation, where n can occur in a.

When A is 𝔹, the representation as a predicate is even weaker in the sense that a functional relation induces a predicate but the converse requires classical reasoning. We can easily turn a predicate into a relation, but proving that the resulting relation is total requires a call to excluded-middle on the predicate. When A is 𝔹 and α is a predicate, we define α(n) = tt as α holding at n and α(n) = ff as α not holding at n. Similarly, λn.P defines a predicate.

In particular, this means that all choice and bar induction statements of this paper have two readings of a different logical strength (depending on the validity of the axiom of unique choice in the metatheory), or even three readings (depending on the validity of the axiom of unique choice and of classical reasoning) when the codomain of the function mentioned in the theorems is 𝔹.
If α ∈ A^ℕ, we write u ≺ α to mean that u is an initial prefix of α. This is defined inductively by the following clauses:

• ⟨⟩ ≺ α

• if u ≺ α and α(|u|) = a then u⋆a ≺ α

If a ∈ A and α ∈ A^ℕ, we write a·α for the sequence β defined by β(0) = a and β(n+1) = α(n).
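Representing infinite sequences as functions from ℕ, the operations of this subsection admit a direct sketch (Python, names ours):

```python
def cons(a, alpha):
    """The sequence a·α: (a·α)(0) = a and (a·α)(n+1) = α(n)."""
    return lambda n: a if n == 0 else alpha(n - 1)

def is_prefix(u, alpha):
    """u ≺ α: the tuple u is an initial prefix of the infinite sequence α."""
    return all(alpha(n) == u[n] for n in range(len(u)))

alpha = lambda n: n * n           # the infinite sequence 0, 1, 4, 9, ...
beta = cons(7, alpha)             # the sequence 7, 0, 1, 4, 9, ...

assert beta(0) == 7 and beta(2) == 1
assert is_prefix((7, 0, 1), beta)
assert not is_prefix((7, 1), beta)
# Extending a prefix by the next value of the sequence yields again a prefix:
assert is_prefix((7, 0, 1) + (beta(3),), beta)
```

The last assertion is the computational content of the second clause defining ≺.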

We have the following easy property:

###### Proposition 1

If u ≺ α then u⋆α(|u|) ≺ α.

### II-C Trees and monotone predicates

Let A be a type and T be a predicate on A*. We overload the notation, writing u ∈ T to mean that T holds on u. We say that A is finitely-branching if A is in bijection with a non-empty bounded subset of ℕ (i.e. with {0, …, n−1} for some n).

We say that T is a tree if it is closed under restriction, and, dually, that T is monotone if it is closed under extension (the formal definitions are given in Table II). Classically, the complement of a tree is monotone and, dually, the complement of a monotone predicate is a tree. In particular, another way to describe a tree is as an antimonotone predicate (from a categorical perspective, a tree is a contravariantly functorial predicate over the preorder generated by ≤, while a monotone predicate is covariantly functorial). It is convenient for the underlying intuition to restrict oneself to predicates which are trees, or which are monotone, even if it does not always matter in practice. When it matters, a predicate is turned into a tree either by discarding sequences not connected to the root or by completing it with missing sequences from the root: these are respectively the downwards arborification and the upwards arborification of a predicate, as shown in Table III. We dually write the upwards monotonisation and downwards monotonisation of a predicate. Arborification and monotonisation are idempotent. We shall in general look for minimal definitions of the concepts involved in the paper, and thus consider arbitrary predicates as much as possible, turning them into trees or monotone predicates only when needed to give sense to the definitions.
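The two arborifications can be sketched on finite predicates; since the formal definitions live in Table III, which is not reproduced here, the following Python fragment (names ours) illustrates our informal reading: the downwards arborification discards sequences not connected to the root, the upwards one adds all missing prefixes.

```python
def prefixes(u):
    """All initial prefixes of the tuple u, including ⟨⟩ and u itself."""
    return {u[:i] for i in range(len(u) + 1)}

def down_arborification(T):
    """Discard sequences not connected to the root: keep u only when
    every prefix of u is itself in T."""
    return {u for u in T if prefixes(u) <= T}

def up_arborification(T):
    """Complete T with the missing sequences from the root: add every
    prefix of every member of T."""
    return set().union(*map(prefixes, T)) if T else set()

T = {(), (0,), (0, 1), (1, 1)}     # (1, 1) is disconnected: (1,) is missing
assert down_arborification(T) == {(), (0,), (0, 1)}
assert up_arborification(T) == {(), (0,), (1,), (0, 1), (1, 1)}
# Arborification is idempotent:
assert down_arborification(down_arborification(T)) == down_arborification(T)
```

Both results are trees (closed under restriction), as claimed in the text.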

### II-D Well-foundedness and ill-foundedness properties

We list properties of predicates which are relevant for stating ill-foundedness axioms (i.e. choice axioms) and their dual well-foundedness axioms (i.e. bar induction axioms). Duality can be understood both under a classical or a linear interpretation of the connectives, where the predicate in one column is supposed to be the dual of the predicate occurring in the other column (dual predicates in linear logic, negated predicates in classical logic). Table IV details properties which differ by contraposition and are thus logically equivalent (in classical and linear logic). On the other hand, Tables V and VI detail properties which are logically opposite.

We indicated with (*) the concepts for which we did not find an existing terminology in the literature; for these, the terminology is ours. Also, what we call staged infinite is often simply called infinite. We used staged infinite to make explicit the difference from a definition based on the presence of an infinite number of nodes; thereby we also obtain a symmetry with the notion of staged barred. What we call having an infinite branch could alternatively be called ill-founded, or having a choice function. In particular, the terminology having an infinite branch applies here to any predicate and is not restricted to trees. Note that well-founded in the standard meaning is the same as barred for the dual predicate. In particular, when opposing ill-foundedness and well-foundedness, we adopt a bias towards the tree view, i.e. towards the left column.

We have the following:

###### Proposition 2

If T is a tree, then having unbounded paths is equivalent to being staged infinite. Dually, if T is monotone, being a uniform bar is equivalent to being staged barred.

Proof:  Because trees and monotone predicates are invariant under arborification and monotonisation.

As a consequence, it is common to use the notion of staged infinite, which is simpler to formulate, when we know that T is a tree. Otherwise, if T is an arbitrary predicate which is not necessarily a tree, there is no particular interest in using the notion of staged infinite. Similarly, staged barred is a simpler way to state uniformly barred when T is monotone; conversely, uniform bar is the expected refinement of staged barred when T is not known to be monotone.

A progressing predicate may be productive at some sequences without being productive at all of them, so we may need to prune it to extract from it a spread. Dually, not all barricaded predicates are inductive bars at all sequences, but we can saturate them into inductive bars by taking the hereditary closure. We make this formal in the following proposition:

###### Proposition 3

If T is productive then its pruning is a spread. Dually, if T is barricaded then its hereditary closure is an inductive bar.

Proof:  That the empty sequence is in the pruning of T follows directly from T being productive. That the pruning of T is progressing at all sequences is also direct by construction of the pruning. The other part of the statement is by duality.

Conversely, by coinduction, the pruning of any progressing predicate contains the predicate, and dually, induction shows that the hereditary closure of a hereditary predicate is included in the predicate. Thus, we have:

###### Proposition 4

We can then relate productive and spread, as well as inductive bar and barricaded as follows:

###### Proposition 5

T is productive iff there exists a spread included in T. Dually, T is an inductive bar iff every predicate containing T is barricaded.

Proof:  By duality, it is enough to prove the first equivalence. From left to right, we use Prop. 3, observing that the pruning of T is included in T. From right to left, a spread is productive and a coinduction suffices to prove that inclusion preserves productivity.

On the other hand, having unbounded paths is equivalent to being a spread or to being productive only when the domain is finitely-branching. Similarly for being uniformly barred compared to being an inductive bar or being barricaded. Moreover, none of these equivalences hold linearly. The second one requires intuitionistic logic, i.e. requires the ability to use a hypothesis several times, while the first one, dually, requires a bit of classical reasoning (or, to be more precise, co-intuitionistic reasoning, that is, using a multi-conclusion sequent calculus to formulate the reasoning, with the contraction rule allowed on conclusions but not on hypotheses).

For being a class of formulae and and ranging over , let be the principle . Dually, let be .

###### Proposition 6

If A is non-empty and finite, then being productive is equivalent to having unbounded paths, and being an inductive bar is equivalent to being uniformly barred. The first statement holds in a logic where the corresponding principle holds and the second in a logic where its dual holds, for a class of formulae containing arithmetical existential quantification.

Proof:  Relying on duality, we only prove the first statement. Based on our definition of finite, we also assume without loss of generality that is . Our proof relies on an argument found in [3, 18] and proceeds by proving more generally that is productive from iff has unbounded paths from .

From left to right, we reason by induction on . If is this is direct from productive by defining . Otherwise, by productive from , we get such that is productive from , obtaining by induction of length such that , showing that is the expected sequence of length .

From right to left, we reason coinductively. To prove that , we take a path of length . Then, in order to apply the coinduction hypothesis and prove the coinductive part, we prove that there is such that has unbounded paths from . By , it is enough to prove that for all and , there is a path of length and a path of length such that either or is in . So, let and be given lengths. By unbounded paths from , we get a sequence of length such that . This is a non-empty sequence, hence a sequence of the form so that we have either or for of length . By closure of , prefixes of length and of length of can be extracted which both are in .
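The use of finite branching in this argument can be made computational for decidable trees: if a binary tree has paths of every length, then one child of the root does too, and deciding which one is exactly where LLPO-style reasoning enters. The following Python sketch (ours, not the paper's construction) makes that choice by brute-force search up to a depth fixed in advance, which suffices for finite examples:

```python
from itertools import product

def has_path_of_length(T, u, n):
    """Does the decidable tree T contain a path of length n extending u?"""
    return any(T(u + w) for w in product((0, 1), repeat=n))

def extend_branch(T, depth):
    """Grow a branch of the given length, always moving towards a child
    that still has paths of every remaining length below it."""
    u = ()
    for k in range(depth):
        remaining = depth - k - 1
        a = 0 if has_path_of_length(T, u + (0,), remaining) else 1
        u = u + (a,)
    return u

# Example: the staged-infinite tree of sequences with no two consecutive 1s.
T = lambda u: all(not (u[i] == 1 == u[i + 1]) for i in range(len(u) - 1))

branch = extend_branch(T, 6)
assert len(branch) == 6 and T(branch)
```

The exponential search replaces the omniscience that the non-effective statement silently assumes; it produces only a finite approximation of an infinite branch, never the branch itself.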

Remark: Based on the decomposition of WKL for decidable trees into a choice principle and the Lesser Limited Principle of Omniscience (LLPO), we suspect that we actually have the stronger result that the equivalence of unbounded paths and productivity implies the corresponding instance of LLPO for the underlying class of formulae, and similarly for the dual statement.

### II-E Bar induction and tree-based dependent choice

In the first part of Table VII, we reformulate using our definitions the standard statement of bar induction and a tree-based formulation of dependent choice from the literature. The standard form of Bar Induction, as e.g. in [21], corresponds in our classification to the form given in Table VII, apart from the fact that we do not fix in advance the logical complexity of the domain (such as being countable or not) or the arithmetic strength of the predicate (i.e. whether it is decidable, recursively enumerable, etc.). For dependent choice (or dependent choices for some authors, e.g. [20]), we consider here a pruned-tree-based definition corresponding to an instance of Levy's family of Dependent Choice principles indexed on cardinals [23] (alternatively, it can be seen as the generalisation to arbitrary codomains of the Boolean dependent choice principle described e.g. in Ishihara [18]). A comparison with other logically equivalent definitions of dependent choice will be given in Section II-H.

These formulations of Tree-based Dependent Choice and Bar Induction are not dual of each other (this might be related to coinductive reasoning historically coming later and being less common than inductive reasoning in mathematics), but Prop. 5 gives us a way to connect each one with the dual of the other:

###### Theorem 1

As schemes, generalised over , and are equivalent, and so are and

### II-F Kőnig's Lemma and the Fan Theorem

The second part of Table VII is about Kőnig’s Lemma and the Fan Theorem.

The Fan Theorem is sometimes stated over finitely-branching trees, where the definition of finite itself may vary [21, 18], but it is also sometimes considered by default to be on a binary tree [2, 4, 3, 7, 9, 19], in which case the finite version is sometimes called extended. We call here Fan Theorem the finite version, for finite defined as being in bijection with a finite prefix of ℕ, and with all branchings on the same finite domain. The statement of the Fan Theorem sometimes relies on the notion of inductive bar (e.g. [9]), or on the definition of staged barred for monotone predicates (as a variant in [19]), or on the dual notions of finite tree (i.e., technically, of staged barred for the negation of a tree) and well-founded tree (i.e., technically, of inductively barred for the negation of a tree) in e.g. [5], which respectively correspond to the same notions for the complement of the predicate. But it also often relies on the definition of uniform bar [2, 3, 4, 7, 18, 19, 21] over an arbitrary predicate. Note that, as in the case of bar induction, we omit the usual restriction of the statement of the Fan Theorem to decidable predicates.

Kőnig's Lemma is generally stated as: an infinite tree has an infinite branch, but the definition of infinite may differ from author to author. The definition in [5, 18] expresses explicitly that the infinity can only be in depth. It does so by requiring arbitrarily long branches rather than an infinite number of nodes. The exact definition of arbitrarily long branches also depends on authors. For instance, [30] relies (up to classical reasoning) on having unbounded paths for arbitrary predicates rather than trees, but most of the time it is about what we call staged infinite trees [3, 18, 19]. These versions imply LLPO [17]. Contrastingly, there are also "pure choice" versions not implying LLPO (see Prop. 6 for the connection), the binary variant of which occurs for instance in the literature [3].

There is a standard way to go from arbitrary predicates to trees or monotone predicates by associating to each predicate its (downwards or upwards) tree or monotone closure. This allows us to show that it is equivalent to state Kőnig's Lemma on trees using staged infinity or on arbitrary predicates using unbounded paths, and, similarly, that it is equivalent to state the Fan Theorem on monotone predicates using staged barred or on arbitrary predicates using uniformly barred.

###### Proposition 7

As schemes, when generalised over , is equivalent to and to .

Proof:  We treat the first equivalence. From left to right, if T is an arbitrary predicate, we apply the tree-based statement to its downwards arborification. The resulting infinite branch is an infinite branch in T because the downwards arborification of T is included in T. From right to left, the statement holds by Prop. 2. The second equivalence is by duality.

### II-G Choice and bar induction as relating intensional and extensional concepts

The intensional definitions are stronger than the extensional ones, which implies that the choice and bar induction axioms can alternatively be seen as stating the logical equivalence of the intensional and extensional versions of ill-foundedness and well-foundedness properties (of various strengths).

###### Theorem 2

If T is inductively barred then T is barred. Dually, if T has an infinite branch then T is productive.

Proof:  We prove by induction on the definition of inductively barred that inductively barred at implies barred from where the latter requires that for all , there is such that .

If , then it is enough to take for to get for any . If is barred from for all , this means that there is such that for any . For a given , set and so that we can find , hence , i.e. (by Prop. 1) together with .

The dual proof builds productive at from has an infinite branch from by coinduction. From the infinite branch from and we get , i.e. . It remains to find such that is productive from and it suffices to take since has an infinite branch from simply because implies (by Prop. 1) and from .

### II-H Relation to other formulations of Dependent Choice and to countable Zorn's Lemma

For a relation R on B, it is common to formulate dependent choice as

∀b ∈ B ∃b′ ∈ B R(b, b′) ⇒ ∀b₀ ∈ B ∃f ∈ ℕ → B (f(0) = b₀ ∧ ∀n R(f(n), f(n+1))).
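When a function witnessing the premise ∀b ∃b′ R(b, b′) is explicitly given, the conclusion is a plain recursion; the following Python sketch (ours) illustrates the computational content that the axiom asserts even in the absence of such a witness:

```python
def dependent_choice(step, b0, n):
    """Iterate an explicit witness of seriality: step(b) returns some b'
    with R(b, b'). Produces the first n + 1 values of a sequence f with
    f(0) = b0 and R(f(k), f(k + 1)) for all k < n."""
    f = [b0]
    for _ in range(n):
        f.append(step(f[-1]))
    return f

# Example: R(b, b') := b' > b on naturals, witnessed by step(b) = b + 1.
R = lambda b, b2: b2 > b
f = dependent_choice(lambda b: b + 1, 0, 5)

assert f[0] == 0
assert all(R(f[k], f[k + 1]) for k in range(5))
```

The axiom is precisely the assertion that such an f exists when only the relational premise, and no `step` function, is available.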

Let us call serial a (homogeneous) relation R such that ∀b ∃b′ R(b, b′) holds. In this section, we formally compare the resulting statement of dependent choice to the tree-based formulation of Section II-E, examining also dual statements.

Let R be a serial relation. Using a seed b₀ ∈ B, each such relation can be turned into a predicate on finite sequences in the two following ways:

• The chaining of R from b₀ is probably the most natural one: a sequence is in the predicate if all consecutive steps in the sequence, starting from b₀, are in R.

• The alignment of R from b₀ artificially uses non-empty sequences to represent pairs of elements. A sequence is in the predicate either when it has at least two elements and the last two elements are related by R, or when the sequence contains exactly one element and this element is related to b₀, or, finally, when the sequence is simply empty.
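Our reading of the two encodings can be sketched as follows (Python; since the formal definitions live in Table VIII, which is not reproduced here, take this as an illustration of the informal descriptions above rather than of the exact definitions):

```python
def chaining(R, b0, u):
    """All consecutive steps in the sequence b0 ⋆ u are related by R."""
    s = (b0,) + u
    return all(R(s[i], s[i + 1]) for i in range(len(s) - 1))

def alignment(R, b0, u):
    """Only the last step is inspected: the last two elements of u are
    related, or u's single element is related to the seed, or u is empty."""
    if len(u) >= 2:
        return R(u[-2], u[-1])
    if len(u) == 1:
        return R(b0, u[0])
    return True

R = lambda b, b2: b2 == b + 1          # the successor relation on integers
u = (1, 2, 3)

assert chaining(R, 0, u)
# The chaining holds exactly when the alignment holds of every prefix:
assert chaining(R, 0, u) == all(alignment(R, 0, u[:i]) for i in range(len(u) + 1))
assert not chaining(R, 0, (1, 3))
```

The last comparison is the finite shadow of the relationship between the two encodings stated in Prop. 8.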

Reasoning by induction on in one direction and on in the other direction, we can show that both are related:

###### Proposition 8

iff

Dually, we can define antichaining and blockings such that:

###### Proposition 9

iff

The formal definitions are given in Table VIII, where we can notice that the use of vs.  does not matter in practice since the structure of the relation is a function of .

We are now in a position to state in Table IX a relatively standard form of Dependent Choice, for R a relation on B and b₀ a seed in B. Though to our knowledge uncommon in the literature, we also mention its dual.

We state a few results that allow us to show the equivalence of these formulations as schemes.

We have the following properties.

###### Proposition 10

serial implies productive for any . Dually, if is inductively barred then has a least element.

Proof:  We prove by coinduction that seriality implies productivity from any sequence. If the sequence is empty, membership holds by definition and there is by seriality an element related to the seed. This allows us to conclude by coinduction hypothesis. If the sequence has a last element, there is also by seriality an element related to it and we can again conclude by coinduction hypothesis. Productivity finally follows because the empty sequence is in the predicate by definition. The dual statement is by dual (inductive) reasoning.

Conversely, for a predicate, let be defined by and let be the relation on defined by . The relation is serial by construction: for such that is productive from , there is such that is productive from and . Also, as soon as is productive.

We can now formally state the correspondence in our language:

As schemes, and