# Complexity of Non-Monotonic Logics

Over the past few decades, non-monotonic reasoning has developed into one of the most important topics in computational logic and artificial intelligence. Different ways to introduce non-monotonic aspects to classical logic have been considered, e.g., extension with default rules, extension with modal belief operators, or modification of the semantics. In this survey we consider a logical formalism from each of the above possibilities, namely Reiter's default logic, Moore's autoepistemic logic and McCarthy's circumscription. Additionally, we consider abduction, where one is not interested in inferences from a given knowledge base but in computing possible explanations for an observation with respect to a given knowledge base. Complexity results for different reasoning tasks for propositional variants of these logics have been studied already in the nineties. In recent years, however, a renewed interest in complexity issues can be observed. One current focal approach is to consider parameterized problems and identify reasonable parameters that allow for FPT algorithms. In another approach, the emphasis lies on identifying fragments, i.e., restrictions of the logical language, that allow for more efficient algorithms for the most important reasoning tasks. In this survey we focus on this second aspect. We describe complexity results for fragments of logical languages obtained either by restricting the allowed set of operators (e.g., by forbidding negation one might consider only monotone formulae) or by considering only formulae in conjunctive normal form but with generalized clause types. The algorithmic problems we consider are suitable variants of satisfiability and implication in each of the logics, but also counting problems, where one is not only interested in the existence of certain objects (e.g., models of a formula) but asks for their number.


## 1 Introduction

Over the past few decades, non-monotonic reasoning has developed into one of the most important topics in computational logic and artificial intelligence. The non-monotonicity here refers to the fact that, while in usual (monotonic) reasoning adding more axioms leads to potentially more possible conclusions, in non-monotonic reasoning adding new facts to a knowledge base may prevent previously valid conclusions. Different ways to introduce non-monotonic aspects to classical logic have been considered:

1. The derivation process may be extended using non-monotonic inference rules.

2. The logical language may be extended with non-monotonic belief operators.

3. The definition of semantics may be changed.

In this survey we consider a logical formalism from each of the above possibilities, namely

• Reiter’s default logic (as a candidate for possibility (1) above), which introduces default inference rules of the form α : β / γ, where such a rule intuitively expresses that γ can be derived from α as long as β is consistent with our knowledge;

• Moore’s autoepistemic logic (a candidate for (2)), which extends classical logic with a modal operator L to express the beliefs of an ideal rational agent, in the sense that Lφ expresses that φ is provable;

• McCarthy’s circumscription (as candidate for (3)), which restricts the semantics to the minimal models of a formula or set of formulae.

Additionally we survey abduction, where one is not interested in inferences from a given knowledge base but in computing possible explanations for an observation with respect to a given knowledge base.

Complexity results for different reasoning tasks for propositional variants of these logics have been studied already in the nineties. It was shown that in each case, the complexity is higher than for usual propositional logic (typically complete for some level of the polynomial-time hierarchy). In recent years, however, a renewed interest in complexity issues can be observed. One current focal approach is to consider parameterized problems and identify reasonable parameters that allow for FPT algorithms. In another approach, the emphasis lies on identifying fragments, i.e., restrictions of the logical language, that allow for more efficient algorithms for the most important reasoning tasks.

In this survey we focus on this second aspect. We describe complexity results for fragments of logical languages obtained either by restricting the allowed set of operators (e.g., by forbidding negation one might consider only monotone formulae) or by considering only formulae in conjunctive normal form but with generalized clause types (which are also called Boolean constraint satisfaction problems).

The algorithmic problems we consider are suitable variants of satisfiability and implication in each of the logics, but also certain counting problems, where one is not only interested in the existence of certain objects (e.g., models of a formula) but asks for their number.

## 2 Post’s Lattice

In 1941, Post showed that the sets of Boolean functions closed under projections and arbitrary composition, called clones, form a lattice containing only countably infinitely many such closed sets, and he identified a finite base for each of them [Pos41]. The closure operation, denoted by [·], is not arbitrarily chosen but rather captures an intuitive understanding of expressiveness: given a set B of Boolean functions, [B] denotes the set of Boolean functions expressible using functions from B, or equivalently: computable with Boolean circuits whose gates perform functions from B. Moreover, it is well behaved with respect to computational complexity; e.g., if Π(B) is a decision problem defined over Boolean circuits with gates corresponding to Boolean functions from B, then Π(B₁) reduces to Π(B₂) for all finite sets B₁, B₂ of Boolean functions such that all functions from B₁ can be expressed in B₂, i.e., B₁ ⊆ [B₂]. Similar statements hold for decision problems defined over Boolean formulae (see [Tho10a]). Post’s lattice thus holds the key to studying and classifying the computational complexity of problems parameterized by finite sets of available Boolean functions. In this section, we will define the required terms and notation to introduce Post’s lattice.
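To make the closure operation concrete, the following small brute-force sketch (our own illustration, not taken from the survey; all names are ours) computes every binary function in the clone generated by a base, representing each binary function by the 4-tuple of its truth-table values:

```python
from itertools import product

# A binary Boolean function is encoded by its truth table: a 4-tuple of
# its values on the inputs (0,0), (0,1), (1,0), (1,1).
AND = (0, 0, 0, 1)
OR = (0, 1, 1, 1)
PROJ1 = (0, 0, 1, 1)  # (x, y) -> x
PROJ2 = (0, 1, 0, 1)  # (x, y) -> y

def compose(f, g1, g2):
    # h(x, y) = f(g1(x, y), g2(x, y)), computed pointwise on the tables
    return tuple(f[2 * g1[i] + g2[i]] for i in range(4))

def clone_arity2(base):
    """All binary functions of the clone [base]: close the base together
    with the projections under composition until a fixpoint is reached."""
    fns = set(base) | {PROJ1, PROJ2}
    while True:
        new = {compose(f, g1, g2) for f in fns for g1 in fns for g2 in fns}
        if new <= fns:
            return fns
        fns |= new

def is_monotone(f):
    # f is monotone if f(a) <= f(b) whenever a <= b coordinatewise
    points = list(product((0, 1), repeat=2))
    return all(f[2 * a + b] <= f[2 * c + d]
               for (a, b) in points for (c, d) in points
               if a <= c and b <= d)
```

For instance, the binary part of the clone generated by conjunction and disjunction consists of the two projections together with ∧ and ∨, all of which are monotone, in accordance with the clone of monotone functions.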

Let ℒ be the set of propositional formulae, i.e., the set of formulae defined via

 φ ::= a ∣ f(φ, …, φ),

where a is a proposition and f is an n-ary Boolean function (we do not distinguish between connectives and their associated functions). For a finite set B of Boolean functions, a B-formula is a Boolean formula using functions from B only. The set of all B-formulae is denoted by ℒ(B).

A clone is a set of Boolean functions that is closed under superposition, i.e., it contains all projections (that is, the functions Iⁿₖ(x₁, …, xₙ) = xₖ for n ∈ ℕ and 1 ≤ k ≤ n) and is closed under arbitrary composition [Pip79]. For a set B of Boolean functions, we denote by [B] the smallest clone containing B and call B a base for [B]. A B-formula φ is called a B-representation of ψ if φ and ψ are logically equivalent.

Post showed that the set of all clones ordered by inclusion together with meet and join forms the lattice depicted in Figure 1. To give the list of all the clones, we need the following properties. Say that a set T ⊆ {0,1}ⁿ is c-separating, c ∈ {0,1}, if there exists an index i ∈ {1, …, n} such that t ∈ T implies tᵢ = c. Let f be an n-ary Boolean function and define the dual of f to be the Boolean function dual(f)(x₁, …, xₙ) = ¬f(¬x₁, …, ¬xₙ). We say that:

• f is c-reproducing if f(c, …, c) = c, c ∈ {0,1};

• f is c-separating if f⁻¹(c) is c-separating, c ∈ {0,1};

• f is c-separating of degree m if all T ⊆ f⁻¹(c) with |T| = m are c-separating;

• f is monotone if a₁ ≤ b₁, …, aₙ ≤ bₙ implies f(a₁, …, aₙ) ≤ f(b₁, …, bₙ);

• f is self-dual if f ≡ dual(f);

• f is affine if f ≡ xᵢ₁ ⊕ ⋯ ⊕ xᵢₘ ⊕ c with c ∈ {0,1};

• f is essentially unary if f depends on at most one variable.

The above properties canonically extend to sets of Boolean functions. The list of all clones is given in Table 1, where id denotes the identity function and Tⁿₖ is the n-ary threshold function with threshold k, defined by Tⁿₖ(x₁, …, xₙ) = 1 iff x₁ + ⋯ + xₙ ≥ k, i.e., Tⁿₖ evaluates to 1 if at least k of its inputs are 1.
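The threshold functions and the duality operation are easy to experiment with; the following tiny sketch (our own, with hypothetical helper names) shows, for example, that the ternary majority function T³₂ is self-dual:

```python
# T^n_k: the n-ary threshold function, 1 iff at least k of the n inputs are 1.
def threshold(n, k):
    return lambda *xs: int(sum(xs) >= k)

# dual(f)(x1, ..., xn) = not f(not x1, ..., not xn)
def dual(f):
    return lambda *xs: 1 - f(*(1 - x for x in xs))

maj = threshold(3, 2)  # T^3_2, the ternary majority function
```

Checking all eight inputs confirms that maj coincides with its own dual, i.e., majority is self-dual (it is also clearly monotone).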

The following easy observation, which will be useful in the subsequent sections, gives a first example of the relationship between function-restricted sets of Boolean formulae.

###### Lemma 2.1

Let B be a finite set of Boolean functions and let Σ be a set of B ∪ {1}-formulae. Then Σ can be transformed in logspace into a set Σ′ of B-formulae such that the numbers of satisfying assignments of Σ and Σ′ coincide.

###### Proof sketch.

The idea is to add a fresh proposition t to Σ and to replace all occurrences of the constant 1 with t. As a result we obtain that, for problems Π defined over sets of Boolean formulae, Π(B ∪ {1}) reduces to Π(B). ∎
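This transformation can be spelled out directly. The brute-force sketch below (our own illustration; the tuple-based formula encoding is ours) replaces the constant 1 by a fresh proposition t, adds t itself to the set, and confirms on an example that the model counts coincide:

```python
from itertools import product

# Formulas as nested tuples over a toy language with the constant 1:
# ('var', name), ('const', 1), ('not', f), ('and', f, g), ('or', f, g).
def ev(phi, a):
    op = phi[0]
    if op == 'var':
        return a[phi[1]]
    if op == 'const':
        return phi[1]
    if op == 'not':
        return 1 - ev(phi[1], a)
    if op == 'and':
        return ev(phi[1], a) & ev(phi[2], a)
    return ev(phi[1], a) | ev(phi[2], a)  # 'or'

def variables(phi):
    if phi[0] == 'var':
        return {phi[1]}
    if phi[0] == 'const':
        return set()
    return set().union(*(variables(s) for s in phi[1:]))

def count_models(sigma):
    vs = sorted(set().union(*map(variables, sigma)))
    return sum(all(ev(f, dict(zip(vs, bits))) for f in sigma)
               for bits in product((0, 1), repeat=len(vs)))

def drop_constant(phi, t='t'):
    # replace every occurrence of the constant 1 by the fresh variable t
    if phi == ('const', 1):
        return ('var', t)
    if phi[0] in ('var', 'const'):
        return phi
    return (phi[0],) + tuple(drop_constant(s, t) for s in phi[1:])

sigma = [('or', ('var', 'x'), ('const', 1)), ('and', ('var', 'x'), ('var', 'y'))]
sigma2 = [drop_constant(f) for f in sigma] + [('var', 't')]  # add t to the set
```

Here both sets have exactly one satisfying assignment: the model x = y = 1 of the original set corresponds to the model x = y = t = 1 of the transformed set.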

## 3 Default Logic

Default logic is among the best known and most successful formalisms for non-monotonic reasoning. It was proposed by Raymond Reiter in 1980 [Rei80] and extends classical logic with default rules, i.e., defeasible inference rules with an additional justification. These capture the process of deriving conclusions based on inferences of the form “in the absence of contrary information, assume …”. As, with few exceptions, most of our knowledge about the world is almost true rather than an absolute truth, Reiter argued that his logic is an adequate formalization of human reasoning under the closed world assumption, which allows one to assume the negation of all facts not derivable from the given knowledge base.

Formally, a default rule is an expression of the form α : β / γ, where α, β, γ are formulae; α is called the premise, β is called the justification, and γ is called the conclusion. Further, a default theory is a pair (W, D), where W is a set of formulae and D is a set of default rules. The intended interpretation of a default rule α : β / γ is that γ holds if α can be derived and β is consistent with our knowledge and beliefs about the world. It is this consistency condition that introduces the non-monotonicity:

###### Example 3.1

Consider the default theory with Then we should be able to derive from , as is a fact from and is consistent with the consequences of . However, if is extended with then is no longer consistent with the justification of and we have no right to conclude . The addition of to thus invalidates the consequence .

As another consequence, the derivable knowledge depends on the set of applied defaults:

###### Example 3.2

Consider the default theory with If we apply the left default, then is derivable while the right default is blocked, as its justifications is inconsistent with the conclusion . On the other hand, if we apply the right default first, then is derived and the left default rule gets blocked.

Thus, to appropriately represent the knowledge derivable from a default theory, we introduce the notion of stable extensions.

###### Definition 3.3 ([Rei80])

Let (W, D) be a default theory and E be a set of formulae. Let E₀ = W and Eᵢ₊₁ = Th(Eᵢ) ∪ {γ ∣ α : β / γ ∈ D, α ∈ Eᵢ, ¬β ∉ E}. Then E is a stable extension of (W, D) if and only if E = ⋃ᵢ∈ℕ Eᵢ.

Stable extensions can alternatively be characterized as the least fixed points of an operator Γ: for a given default theory (W, D) and a set E of formulae, let Γ(E) be the smallest set of formulae such that

1. W ⊆ Γ(E),

2. Γ(E) is deductively closed (i.e., Γ(E) = Th(Γ(E))), and

3. for all α : β / γ ∈ D with α ∈ Γ(E) and ¬β ∉ E, it holds that γ ∈ Γ(E)
(in this case, we also say that the default α : β / γ is applicable).

###### Proposition 3.4 ([Rei80])

Let (W, D) be a default theory and E be a set of formulae. Then E is a stable extension of (W, D) iff E is a fixed point of Γ, i.e., iff Γ(E) = E.

We have already observed in Example 3.2 that a default theory may possess several stable extensions; indeed, a default theory with n default rules may have any number of different stable extensions between 0 and 2ⁿ. Thus the following questions naturally arise:

(Extension existence) Does a given default theory admit a stable extension?

(Credulous reasoning) Is a given formula contained in at least one stable extension of a given default theory?

(Skeptical reasoning) And, finally, is a given formula contained in all stable extensions of a given default theory?
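For tiny propositional theories, all three questions can be answered by exhaustive search based on the fixed-point characterization of Proposition 3.4: guess the set of generating defaults, build the candidate theory, and check that it is a fixed point of Γ. The following brute-force sketch is our own illustration (exponential, purely for intuition); formulas are encoded as Python predicates over assignments:

```python
from itertools import chain, combinations, product

# A default rule is a triple (premise, justification, conclusion) of
# formulas; a formula is a predicate over assignments (dicts var -> 0/1).
def models(forms, vs):
    return [bits for bits in product((0, 1), repeat=len(vs))
            if all(f(dict(zip(vs, bits))) for f in forms)]

def entails(forms, g, vs):
    return all(g(dict(zip(vs, bits))) for bits in models(forms, vs))

def subsets(xs):
    return chain.from_iterable(combinations(xs, r) for r in range(len(xs) + 1))

def stable_extensions(W, D, vs):
    """Stable extensions of (W, D), each returned as its set of models.
    A candidate E = Th(W u concl(G)) is stable iff E is a fixed point of
    Gamma: iterate from W, adding the conclusion of every default whose
    premise is entailed and whose justification is consistent with E."""
    exts = []
    for G in subsets(D):
        E = list(W) + [c for (_, _, c) in G]
        Em = models(E, vs)
        T = list(W)
        while True:
            new = [c for (p, j, c) in D
                   if entails(T, p, vs)
                   and any(j(dict(zip(vs, b))) for b in Em)  # just. consistent with E
                   and not entails(T, c, vs)]
            if not new:
                break
            T += new
        if models(T, vs) == Em and Em not in exts:
            exts.append(Em)
    return exts
```

With W = ∅ and the two classic defaults ⊤ : ¬b / a and ⊤ : ¬a / b, this yields exactly two stable extensions, one entailing a and one entailing b; credulous (resp. skeptical) reasoning then amounts to checking whether a query holds in some (resp. every) returned model set.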

The computational complexity of the corresponding decision problems was first explored by Kautz and Selman [KS91], and Stillman [Sti90], who both presented results for syntactically restricted fragments of disjunction-free default logics. In 1992, Gottlob [Got92] and Stillman [Sti92] then independently showed that the computational complexity of these questions is presumably higher than that of the corresponding satisfiability and implication problem in propositional logic:

###### Theorem 3.5 ([Got92, Sti92])

The extension existence problem and the credulous reasoning problem for default logic are Σ₂ᵖ-complete, whereas the skeptical reasoning problem for default logic is Π₂ᵖ-complete.

More recently, Liberatore and Schaerf [LS05] showed that model checking (i.e., the task to decide whether a given assignment is a model of some extension of a given default theory) is Σ₂ᵖ-complete, too. And Ben-Eliyahu-Zohary [BEZ02] extended the complexity landscape of default logic with results on disjunction-free fragments dual to those studied by Kautz and Selman, and Stillman. The study of these fragments was motivated by embeddings of other formalisms into default logic. However, little was known about the complexity of default logics that are not disjunction-free. In [BMTV09b], the authors devised a systematic study of the fragments of default logic obtained by restricting the set of available Boolean functions. The results provide insight into the source of the hardness of default reasoning and reveal the trade-off between expressivity and computational complexity of fragments of default logic.

To present the results, let B be a finite set of Boolean functions. Say that a default theory (W, D) is a B-default theory if all formulae occurring in W and D are B-formulae, and let B-default logic denote default logic restricted to B-default theories.

###### Theorem 3.6 ([BMTV09b])

Let B be a finite set of Boolean functions. Then the extension existence problem for B-default logic is

1. Σ₂ᵖ-complete if B together with the constant 1 can express all Boolean functions,

2. Δ₂ᵖ-complete if B together with the constant 1 can express exactly the monotone functions,

3. NP-complete if B together with the constant 1 can express exactly the affine functions,

4. P-complete if B together with the constants can express only conjunctions or only disjunctions,

5. NL-complete if B can express only the constants, and

6. trivial in all other cases (that is, if all functions in [B] are 1-reproducing),

via logspace many-one reductions.

A key observation to the proof of this theorem is the following lemma.

###### Lemma 3.7

Let B be a finite set of Boolean functions. If every function in B is monotone, then any B-default theory has at most one stable extension; if every function in B is 1-reproducing, then any B-default theory has exactly one stable extension.

###### Proof.

Suppose first that B is monotone. Then each function in [B] is either 1-reproducing or equivalent to the constant 0. Default rules with a justification equivalent to 0 are never applicable to a consistent extension; and as for inconsistent W the set of all formulae is the only stable extension of the default theory [Rei80, Corollary 2.3], we may w.l.o.g. suppose that all justifications are 1-reproducing. Now observe that the negation of a 1-reproducing function is not 1-reproducing, while all formulae derivable in a 1-reproducing default theory are. Indeed, if W is consistent, then any stable extension of such a default theory is consistent and satisfied by the assignment setting all propositions to 1, which also satisfies the justification of each default rule, so no default is ever blocked. Thus any monotone default theory may possess at most one stable extension, while any 1-reproducing default theory possesses exactly one stable extension. ∎

We will sketch the proof of Theorem 3.6.

###### Proof sketch.

For the first case, the Σ₂ᵖ-hardness follows from Theorem 3.5 and Lemma 2.1, as the standard connectives ∧, ∨, ¬ can be expressed in these fragments, and the Σ₂ᵖ upper bound easily generalizes from the standard connectives to arbitrary sets of Boolean functions.

The trivial case follows directly from Lemma 3.7: if all functions in [B] are 1-reproducing, a stable extension is guaranteed to exist. It hence remains to consider those sets B such that [B] contains a function that is not 1-reproducing. These are all included in either the clone of monotone functions or the clone of affine functions (or both).

For monotone B, membership in Δ₂ᵖ follows similarly from Lemmas 2.1 and 3.7: the only way for a monotone default theory not to possess a stable extension is to contain an applicable default rule whose conclusion is equivalent to the constant 0. It thus suffices to compute the set of applicable defaults using subsequent calls to an NP-oracle for B-formula implication and to verify that their conclusions are satisfied by the assignment setting all propositions to 1. It is straightforward to implement this as a Δ₂ᵖ-algorithm. The Δ₂ᵖ-hardness, on the other hand, is established using a reduction from the sequentially nested satisfiability problem, which was first identified to be Δ₂ᵖ-complete in [Got95a, Theorem 3.4] (see also [LMS01]).

If one further restricts the set B such that it contains the Boolean constants and either conjunctions or disjunctions only, then formula implication and the extension existence problem become tractable [BMTV09a]. Indeed, the latter becomes P-complete, as can be shown by a reduction from a variant of the circuit value problem.

Further restricting the set B such that it contains only the Boolean constants leads to default theories with rules whose premise and conclusion are single propositions. As a result, the existence of a stable extension reduces to the complement of the reachability problem in directed graphs. Similarly, reachability in such graphs can be transformed into the question whether a given default theory does not have a stable extension. As the reachability problem is NL-complete and NL is closed under complement, the extension existence problem for such default theories is NL-complete.

Finally, for affine B with access to the constant 1, we observe a different situation. There may exist exponentially many different stable extensions; yet, the verification of a candidate is tractable, because implication and satisfiability of affine formulae are decidable in polynomial time [BMTV09a]. Hence, the extension existence problem becomes solvable in NP. The NP-hardness, on the other hand, is obtained by reducing from the satisfiability problem for 3CNF formulae: given a formula φ = (ℓ₁₁ ∨ ℓ₁₂ ∨ ℓ₁₃) ∧ ⋯ ∧ (ℓₙ₁ ∨ ℓₙ₂ ∨ ℓₙ₃), we construct the default theory (∅, D) with

 D := { 1 : xᵢ / xᵢ ,  1 : ¬xᵢ / ¬xᵢ ∣ xᵢ ∈ Vars(φ) } ∪ { ℓ̄ᵢ₁ : ℓ̄ᵢ₂ / ℓᵢ₃ ∣ 1 ≤ i ≤ n },

where, for a literal ℓ, ℓ̄ denotes the literal of opposite polarity, and for a formula φ, Vars(φ) denotes the set of all variables occurring in φ. It is easy to verify the correctness of this reduction. As the above default theory can easily be written as a B-default theory for all sets B of this case, the proof is complete. ∎
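The reduction can be spelled out programmatically. In the sketch below (our own encoding; all names are ours), a literal is a pair (variable, polarity), and for each total assignment we check that the candidate extension it induces — the set of literals it makes true — is closed under the clause defaults exactly when the assignment satisfies the formula:

```python
from itertools import product

def neg(lit):
    return (lit[0], not lit[1])

def default_theory(cnf, variables):
    """The default theory of the reduction: guessing defaults 1 : x / x and
    1 : -x / -x for every variable, and a clause default -l1 : -l2 / l3
    for every clause (l1 v l2 v l3)."""
    D = [('guess', (x, s), (x, s)) for x in variables for s in (True, False)]
    D += [(neg(l1), neg(l2), l3) for (l1, l2, l3) in cnf]
    return D

def holds(lit, a):
    return a[lit[0]] == lit[1]

def satisfies(a, cnf):
    return all(any(holds(l, a) for l in clause) for clause in cnf)

def induces_extension(a, D):
    # The candidate extension chosen by assignment a contains exactly the
    # literals a makes true.  It fails to be stable iff some clause
    # default is applicable (premise holds, justification consistent with
    # the candidate) while its conclusion is missing.
    for prem, just, concl in D:
        if prem == 'guess':
            continue  # the guessing defaults always apply consistently
        if holds(prem, a) and holds(just, a) and not holds(concl, a):
            return False
    return True
```

For a clause default ℓ̄₁ : ℓ̄₂ / ℓ₃ the failure condition triggers precisely when all three literals of the clause are false, so the stable candidates correspond exactly to the satisfying assignments.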

###### Remark 3.8

As default rules require the justification to be consistent with a stable extension (i.e., ¬β ∉ E), another conceivable formalization of B-default logic would be to require the premise α and the conclusion γ to be B-formulae and the justification β to be the negation of a B-formula. For this formalization, the extension existence problem for B-default logic is Σ₂ᵖ-complete for the expressive fragments and tractable otherwise (with the tractable case splitting into P-complete cases and logspace-solvable cases).

Given the upper and lower bounds for the extension existence problem, it is easy to settle the complexity of the credulous and skeptical reasoning problems. Define the credulous (resp. skeptical) reasoning problem for B-default logic as the problem to decide, given a B-default theory (W, D) and a B-formula φ, whether φ is contained in at least one stable extension (resp. in all stable extensions) of (W, D).

###### Theorem 3.9 ([BMTV09b])

Let B be a finite set of Boolean functions. Then the credulous (resp. skeptical) reasoning problem for B-default logic is

1. -complete (resp. -complete) if or ,

2. -complete if ,

3. -complete if or or ,

4. -complete (resp. -complete) if ,

5. -complete if or or , and

6. -complete in all other cases (that is, if ),

via logspace many-one reductions.

Another possibility to study fragments of default logic, aside from restricting the available Boolean functions, is Schaefer’s framework [Sch78]. This framework is motivated by the constraint satisfaction problem, where a set of conditions represented as logical relations has to be satisfied simultaneously. Hence, the set W and the formulae occurring in D are assumed to be sets of applications of relations, i.e., sets of expressions of the form R(x₁, …, xₖ), where R(x₁, …, xₖ) is a shorthand for the condition (x₁, …, xₖ) ∈ R, the relations R are of arity k and stem from a fixed set Γ of available relations over the domain {0, 1}, and the x₁, …, xₖ are propositions. Such a set of applications of relations is correspondingly said to be satisfied by an assignment σ if σ satisfies every application in the set. Call a relation Schaefer if it is either

• affine (coincides with the set of models of a conjunction of XOR-clauses, i.e., of a system of linear equations over GF(2)),

• bijunctive (coincides with the set of models of a 2CNF formula),

• Horn (coincides with the set of models of a Horn formula), or

• dual Horn (coincides with the set of models of a dual Horn formula).

And say that a set Γ of relations is Schaefer if at least one of the above four properties is satisfied by all relations in Γ. Call a default theory (W, D) in which W and the formulae occurring in D are sets of applications of relations from Γ a default theory over relations from Γ.
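Each of the four properties corresponds to a well-known semantic closure condition on the relation: Horn relations are closed under coordinatewise ∧, dual Horn relations under ∨, bijunctive relations under the ternary majority operation, and affine relations under ternary ⊕. A small sketch of these tests (our own illustration; names are ours):

```python
from itertools import product

# A Boolean relation R is a set of equal-length 0/1 tuples.  R is closed
# under an operation op iff applying op coordinatewise to any tuples of R
# yields a tuple of R again.
def closed_under(R, op, arity):
    return all(tuple(op(*cols) for cols in zip(*ts)) in R
               for ts in product(R, repeat=arity))

is_horn = lambda R: closed_under(R, lambda a, b: a & b, 2)
is_dual_horn = lambda R: closed_under(R, lambda a, b: a | b, 2)
is_bijunctive = lambda R: closed_under(
    R, lambda a, b, c: (a & b) | (a & c) | (b & c), 3)  # ternary majority
is_affine = lambda R: closed_under(R, lambda a, b, c: a ^ b ^ c, 3)
```

For instance, the relation given by the models of the clause (x ∨ y) is dual Horn and bijunctive but neither Horn nor affine, while the models of a ternary clause (x ∨ y ∨ z) are not even bijunctive.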

We define the extension existence problem for default logic over relations from Γ as the problem to decide, given a default theory (W, D) over relations from Γ, whether (W, D) has a stable extension. Further, define the credulous (resp. skeptical) reasoning problem for default logic over relations from Γ as the problem to decide, given a default theory (W, D) over relations from Γ and a set φ of applications of relations from Γ, whether φ is contained in at least one (resp. in all) stable extensions of (W, D). In [CHS07], Chapdelaine et al. study the complexity of these problems and establish the following trichotomies:

###### Theorem 3.10 ([Chs07, Sch07])

Let Γ be a set of relations. Then the extension existence problem for default logic over relations from Γ is

1. Σ₂ᵖ-complete if Γ is not Schaefer,

2. NP-complete if Γ is Schaefer but neither 0-valid nor 1-valid,

3. in P in all other cases.

###### Theorem 3.11 ([Chs07, Sch07])

Let Γ be a set of relations. Then the credulous (resp. skeptical) reasoning problem for default logic over relations from Γ is

1. Σ₂ᵖ-complete (resp. Π₂ᵖ-complete) if Γ is not Schaefer,

2. NP-complete (resp. coNP-complete) if Γ is Schaefer but neither 0-valid nor 1-valid,

3. -complete if -valid or -valid but not Schaefer,

4. in P in all other cases.

For detailed proofs of these results, see [Sch07].

We would like to remark that results like these about Boolean constraint satisfaction problems are also proved using Post’s lattice. This is because the classes of Boolean relations above (affine, bijunctive, Horn, dual Horn) can all be defined via so-called polymorphisms, a kind of closure property of Boolean relations. The set of polymorphisms of a relation always forms a clone, i.e., it appears somewhere in the lattice. To state only one example, a relation is Horn iff the conjunction function ∧ is among its polymorphisms, i.e., iff the relation is closed under coordinatewise conjunction. The structure of the lattice is then used in the proof to make a case distinction on all possible sets of polymorphisms of Γ and to determine the complexity in each case. For more details, we refer the reader to [CV08].

Having settled the complexity of these decision problems, note that the results so far only speak about the existence of objects, e.g., stable extensions. But what about the complexity of counting them? We will conclude this survey of the complexity of default logic with a treatment of the problem to count the number of stable extensions.

Let us introduce the relevant notions and counting complexity classes first. For alphabets Σ and Γ, let R ⊆ Σ* × Γ* be a binary relation such that the set R(x) := {y ∣ (x, y) ∈ R} is finite for all x ∈ Σ*. We write #·R to denote the following counting problem: given x ∈ Σ*, compute |R(x)|. The class of counting problems computable in polynomial time is denoted by FP. To characterize the complexity of counting problems that are not known to be in FP, we follow [HV95] and define an operator #· on classes C of decision problems: #·R ∈ #·C if (a) there exists a polynomial p such that |y| ≤ p(|x|) for all (x, y) ∈ R, and (b) the problem to decide, given x and y, whether (x, y) ∈ R is in C. Clearly, #·P coincides with #P, the class of functions counting the number of accepting paths of nondeterministic polynomial-time Turing machines—the natural analogue of NP in the counting complexity context [Val79]. Applying #· to the classes of the polynomial hierarchy, we obtain a linearly ordered hierarchy of counting complexity classes [Tod91, HV95]: FP ⊆ #P ⊆ #·coNP ⊆ #·Π₂ᵖ ⊆ ⋯
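The archetypal #P problem is #SAT: counting the witnesses of the NP-relation “w is a satisfying assignment of the formula x”. A brute-force sketch (our own illustration):

```python
from itertools import product

# #SAT by brute force: cnf is a list of clauses; a clause is a list of
# nonzero ints, +i for the variable x_i and -i for its negation.
def count_sat(cnf, n):
    def sat(bits, clause):
        return any(bits[abs(l) - 1] == (l > 0) for l in clause)
    return sum(all(sat(bits, c) for c in cnf)
               for bits in product((False, True), repeat=n))
```

For instance, the single clause (x₁ ∨ x₂) over two variables has three satisfying assignments, i.e., three witnesses.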

The counting complexity of default logic has, to the authors’ best knowledge, first been considered in [Tho10b]. There, it was shown that counting the number of stable extensions is complete for the second level of the counting polynomial hierarchy, #·coNP, whenever B together with the constant 1 can express all Boolean functions; becomes Δ₂ᵖ-complete for the monotone sets B; is complete for the first level of the counting hierarchy, #P, for the affine sets B; and becomes efficiently computable in all other cases. The counting complexity thus decreases analogously to the complexity of the extension existence problem. However, observe that we blur the distinction between decision problems and their characteristic functions: by Lemma 3.7, any monotone B-default theory has at most one stable extension. The problem to count the number of stable extensions thus coincides with the characteristic function of the extension existence problem, which is Δ₂ᵖ-complete.

###### Theorem 3.12 ([Tho10b])

Let B be a finite set of Boolean functions. Then the problem to count the number of stable extensions in B-default logic is

1. #·coNP-complete if B together with the constant 1 can express all Boolean functions,

2. Δ₂ᵖ-complete if B together with the constant 1 can express exactly the monotone functions,

3. #P-complete if B together with the constant 1 can express exactly the affine functions,

4. in FP in all other cases

via parsimonious reductions.

Note that for the classification in Theorem 3.12 the conceptually simple and well-behaved parsimonious reductions are sufficient (a counting problem #·A parsimoniously reduces to a counting problem #·B if there is a polynomial-time computable function f such that |A(x)| = |B(f(x))| for all inputs x [Val79]), while for related classifications in the literature less restrictive and more complicated reductions had to be used (see, e.g., [DHK05, HP07, BBC10]).

## 4 Autoepistemic Logic

Autoepistemic logic was introduced by Moore [Moo85] to overcome some of the peculiarities of the non-monotonic logics devised by McDermott and Doyle [MD80] and McDermott [McD82]. While Moore defined autoepistemic logic without referring to any particular modal system, it turned out that his logic coincides with the non-monotonic modal logic based on KD45 [Shv90]. Therefore, autoepistemic logic can be considered a popular representative among the non-monotonic modal logics. The connection of these logics and particularly autoepistemic logic to default logic has been extensively studied. The first major approach in this direction was taken by Konolige [Kon88], who showed that default logic can be embedded into autoepistemic logic using slightly different semantics for the latter. Subsequently, Marek and Truszczynski [MT89, MT90] showed that, using strengthened notions in autoepistemic logic or weakened notions in default logic, the two logics coincide in terms of expressivity. Finally, Gottlob [Got95b] showed that default logic can be embedded into standard autoepistemic logic, while the converse direction was shown to hold by Janhunen [Jan99].

The intention of Moore was to create a logic modelling the beliefs of an ideally rational agent, i.e., an agent that believes all things it can deduce and refutes belief in everything else. To this end, autoepistemic logic extends propositional logic with the modal operator L stating that its argument “is believed”. The set Lae of all autoepistemic formulae is defined as

 φ ::= a ∣ f(φ, …, φ) ∣ Lφ,

where a is a proposition and f is a Boolean function, and the consequence relation is extended to simply treat formulae starting with an L as atomic. Similarly to default logic, the semantics of autoepistemic logic is defined in terms of fixed points, which in the context of autoepistemic logic are called stable expansions:

###### Definition 4.1 ([Moo85])

Let Σ be a set of autoepistemic formulae. A set Δ ⊆ Lae is a stable expansion of Σ if it satisfies the equation

 Δ = Th(Σ ∪ L(Δ) ∪ ¬L(Lae ∖ Δ)),

where L(Δ) := {Lφ ∣ φ ∈ Δ} and ¬L(Lae ∖ Δ) := {¬Lφ ∣ φ ∈ Lae ∖ Δ}.

###### Example 4.2

Consider the set of autoepistemic formulae. We claim that has two stable expansions, each of which containing . Sticking with our informal interpretation of autoepistemic logic as the logic of an ideally rational agent’s beliefs, observe that we cannot deduce from . Hence, we would assign the value and consequently be able to derive from . This in turn allows us to conclude from , as is derivable from . Indeed, the set of formulae recursively defined via and is a stable expansion of that contains .

On the other hand, we are not able to deduce from either. Hence, we could also continue to assign to the value and be therefore able to derive . Again, we may conclude from . And just as above, with and defined as is a stable expansion of that contains .

There is an important difference to default logic as stable expansions need not be minimal fixed points:

###### Example 4.3

Consider . The set has two stable expansions, one containing and the other one containing . As an iterative construction as in Example 4.2 is doomed to fail for the latter, it may be considered ungrounded in the set of premises .

Clearly, sets of autoepistemic formulae can also possess no stable expansion or a single one. Hence, the expansion existence problem, the credulous reasoning problem and the skeptical reasoning problem arise just as in default logic. The first treatment of the complexity of these problems was performed by Niemelä [Nie90]. In his paper, he gave a finite characterization of stable expansions in terms of full sets: let SF_L(Σ) denote the set of L-prefixed subformulae of formulae in Σ.

###### Definition 4.4 ([Nie90])

Let Σ be a set of autoepistemic formulae. A set Λ ⊆ SF_L(Σ) ∪ {¬Lφ ∣ Lφ ∈ SF_L(Σ)} is Σ-full if for all Lφ ∈ SF_L(Σ):

• Lφ ∈ Λ iff Σ ∪ Λ ⊨ φ.

• ¬Lφ ∈ Λ iff Σ ∪ Λ ⊭ φ.

###### Proposition 4.5 ([Nie90])

Let Σ be a set of autoepistemic formulae. If Λ is a Σ-full set, then there exists exactly one stable expansion Δ of Σ such that Λ ⊆ L(Δ) ∪ ¬L(Lae ∖ Δ). Vice versa, if Δ is a stable expansion of Σ, then there exists exactly one Σ-full set Λ such that Λ ⊆ L(Δ) ∪ ¬L(Lae ∖ Δ).

Using full sets as finite representations for stable expansions, Niemelä obtained a Σ₂ᵖ upper bound for the expansion existence problem and the credulous reasoning problem, and a Π₂ᵖ upper bound for the skeptical reasoning problem: for the expansion existence problem, simply guess a candidate Λ for a full set and verify the conditions given in Definition 4.4 using an NP-oracle for formula implication. To extend this idea to the credulous and skeptical reasoning problems, one still needs a consequence relation that, given Σ and a Σ-full set Λ, yields exactly those formulae contained in the stable expansion corresponding to Λ. Niemelä shows that such a relation can be defined in a way that deciding it Turing-reduces to the implication problem. From this it is easy to see that the credulous and skeptical reasoning problems are contained in Σ₂ᵖ and Π₂ᵖ, respectively. The matching lower bounds were later established by Gottlob in [Got92].
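Niemelä's guess-and-check procedure is easy to prototype by brute force for tiny Σ: enumerate the candidate sets Λ of L-literals, treat each L-prefixed subformula as a fresh atom, and test the two conditions of Definition 4.4 by truth tables. The sketch below is our own (the tuple-based formula encoding is ours):

```python
from itertools import combinations, product

# Autoepistemic formulas as nested tuples:
# ('var', a), ('not', f), ('imp', f, g), ('L', f).
def lsubs(phi):
    if phi[0] == 'var':
        return set()
    if phi[0] == 'L':
        return {phi} | lsubs(phi[1])
    return set().union(*(lsubs(s) for s in phi[1:]))

def atoms(phi):
    # the atoms of phi, treating L-formulas as opaque atomic units
    if phi[0] in ('var', 'L'):
        return {phi}
    return set().union(*(atoms(s) for s in phi[1:]))

def ev(phi, a):
    if phi[0] in ('var', 'L'):
        return a[phi]
    if phi[0] == 'not':
        return not ev(phi[1], a)
    return (not ev(phi[1], a)) or ev(phi[2], a)  # 'imp'

def entails(forms, goal):
    ats = sorted(set().union(*(atoms(f) for f in list(forms) + [goal])), key=str)
    return all(ev(goal, dict(zip(ats, bits)))
               for bits in product((False, True), repeat=len(ats))
               if all(ev(f, dict(zip(ats, bits))) for f in forms))

def full_sets(sigma):
    """All Sigma-full sets, each returned as the set of its positive
    L-members (the negative members are determined by complementation)."""
    SF = sorted(set().union(*(lsubs(f) for f in sigma)), key=str)
    out = []
    for r in range(len(SF) + 1):
        for lam in combinations(SF, r):
            ctx = list(sigma) + list(lam) + \
                  [('not', l) for l in SF if l not in lam]
            if all((l in lam) == entails(ctx, l[1]) for l in SF):
                out.append(set(lam))
    return out
```

For the classic theory Σ = {La → a} this finds two full sets, ∅ and {La}: the first yields the grounded expansion not containing a, the second the ungrounded one containing a.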

###### Theorem 4.6 ([Nie90, Got92])

The expansion existence problem and the credulous reasoning problem for autoepistemic logic are Σ₂ᵖ-complete, whereas the skeptical reasoning problem for autoepistemic logic is Π₂ᵖ-complete.

The hardness is obtained using a surprisingly simple reduction from the validity problem for quantified Boolean formulae of the form ∃x₁⋯∃xₙ∀y₁⋯∀yₘ ψ, where ψ is a propositional formula with Vars(ψ) = {x₁, …, xₙ, y₁, …, yₘ}. Given a formula of the above form, we transform it into a set Σ of autoepistemic formulae defined as

 Σ := {Lxᵢ ↔ xᵢ ∣ 1 ≤ i ≤ n} ∪ {Lψ}.

The idea behind the reduction is to mimic existential quantification using different sets of beliefs such that an assignment witnessing the validity of the formula results in a stable expansion: if s is an assignment to x₁, …, xₙ under which ∀y₁⋯∀yₘ ψ holds, then the candidate set Λ with Lxᵢ ∈ Λ iff s(xᵢ) = 1 entails ψ; therefore, Λ is a Σ-full set. On the other hand, if Λ is a full set, then we can reconstruct from it an assignment witnessing the validity of the formula.

Beyond this, the complexity of these problems for fragments of autoepistemic logic has seemingly only been studied in [CMTV10]. There it was shown that reasoning is Σ₂ᵖ-complete already for autoepistemic logic using only ∧ and ∨, and that tractable fragments occur only for affine sets of Boolean functions. To present the results, say that, for a finite set B of Boolean functions, an autoepistemic B-formula is an autoepistemic formula using Boolean functions from B only, and denote by AE(B) the set of all autoepistemic B-formulae. Further, let B-autoepistemic logic denote autoepistemic logic restricted to autoepistemic B-formulae, and define the credulous (resp. skeptical) reasoning problem for B-autoepistemic logic as the problem to decide, given a set Σ of autoepistemic B-formulae and an autoepistemic B-formula φ, whether φ is contained in some stable expansion (resp. in all stable expansions) of Σ.

###### Theorem 4.7 ([Cmtv10])

Let B be a finite set of Boolean functions. Then the expansion existence problem and the credulous (resp. skeptical) reasoning problem for B-autoepistemic logic are

1. Σ₂ᵖ-complete (resp. Π₂ᵖ-complete) if or or ,

2. NP-complete (resp. coNP-complete) if ,

3. -hard and contained in if ,

4. in P in all other cases,

via logspace many-one reductions.

Note that the complexity classification of these problems differs substantially from that of their analogues in default logic, which can be credited to the different approaches to modelling non-monotonicity: while default logic is limited to consistency testing in the justifications of default rules, autoepistemic logic is capable of both positive and negative introspection. As a consequence, the intertranslatability of autoepistemic logic and default logic does not in general carry over to fragments of these logics (unless complexity-theoretic collapses considered unlikely occur).

We will briefly present the ideas behind Theorem 4.7. To start with, the proof relies on the following lemma, which significantly reduces the number of clones to be considered.

###### Lemma 4.8

Let B be a finite set of Boolean functions and Σ a set of autoepistemic (B ∪ {0,1})-formulae. Then we can construct in logspace a set Σ′ of autoepistemic B-formulae such that the stable expansions of Σ and Σ′ coincide on all autoepistemic formulae over the propositions occurring in Σ.

###### Proof.

Let Σ be given. We first eliminate the constant 1 using Lemma 2.1 and transform the resulting set to Σ′ by substituting all occurrences of the constant 0 with the formula Lt, where t is a fresh proposition. Suppose that Δ is a consistent stable expansion of Σ′. As t cannot be derived from Σ′, Δ has to contain the formula ¬Lt. Hence any satisfying assignment of the formulae in Δ sets Lt to 0. ∎

It thus suffices to consider the complexity of the expansion existence problem for finite sets B of Boolean functions with [B] = [B ∪ {0,1}]. The key observation in the proof of Theorem 4.7 is that the reduction from the validity problem for quantified Boolean formulae of the form ∃x₁⋯∃xₙ∀y₁⋯∀yₘ ψ with ψ in negation normal form does not require negations: given a formula of the above form, replace all negative literals ¬xᵢ and ¬yᵢ in ψ with new propositions xᵢ′ and yᵢ′. Call the resulting formula φ′. We then construct the set Σ of autoepistemic formulae as

Σ := {Lφ′} ∪ {Lxᵢ ∨ xᵢ′, xᵢ ∨ Lxᵢ′ ∣ 1 ≤ i ≤ n} ∪ {yᵢ ∨ yᵢ′ ∣ 1 ≤ i ≤ m}.

Due to the formulae Lxᵢ ∨ xᵢ′ and xᵢ ∨ Lxᵢ′, 1 ≤ i ≤ n, any stable expansion of Σ contains either xᵢ or xᵢ′ (but not both), while the formulae yᵢ ∨ yᵢ′, 1 ≤ i ≤ m, guarantee that either yᵢ or yᵢ′ is set to 1 in any satisfying assignment of φ′. From this, it is easy to see that the given quantified Boolean formula is valid iff Σ has a stable expansion. Hence, the expansion existence problem is Σ₂ᵖ-complete whenever both ∧ and ∨ can be expressed using functions from B.
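The negation-free construction can likewise be sketched as string manipulation. The syntax and names below are our own; the replacement step additionally assumes no variable name is a prefix of another, so that plain string substitution is safe.

```python
def negation_free_reduction(xs, ys, psi_nnf):
    """Replace each negative literal ~v in psi (a string in negation
    normal form) by a primed copy vp, then build Sigma as in the text."""
    phi = psi_nnf
    for v in xs + ys:
        phi = phi.replace(f"~{v}", f"{v}p")
    sigma = {f"L ({phi})"}
    for x in xs:
        sigma.add(f"(L {x} | {x}p)")  # believe x, or assert its copy
        sigma.add(f"({x} | L {x}p)")
    for y in ys:
        sigma.add(f"({y} | {y}p)")    # y or its primed copy must hold
    return sigma

# the instance built from  ∃x1 ∀y1 (x1 ∨ ¬y1) ∧ (¬x1 ∨ y1)
sigma = negation_free_reduction(["x1"], ["y1"], "(x1 | ~y1) & (~x1 | y1)")
```

The resulting set mentions no negation symbol at all; the pairs xᵢ, xᵢ′ and yᵢ, yᵢ′ simulate the two polarities of each variable.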

This construction can be adapted to quantified Boolean formulae of the form ∃x₁⋯∃xₙ ψ with ψ in conjunctive normal form. Hence, we obtain NP-hardness for further sets B. The corresponding NP upper bound follows from the fact that formula implication for the B-formulae in question is tractable.

For the arguments establishing the polynomial-time upper bounds in the remaining cases, we refer the reader to the original paper. This concludes the discussion of Theorem 4.7.

Turning to the problem of counting the number of stable expansions, it is easy to verify that the reductions used to establish the hardness results above are parsimonious. Hence, the complexity classification of the counting problem is analogous to that of the expansion existence problem:

###### Theorem 4.9 ([Cmtv10])

Let B be a finite set of Boolean functions. Then the problem to count the number of stable expansions in B-autoepistemic logic is

1. -complete if or or ,

2. -complete if ,

3. in FP in all other cases,

via parsimonious reductions.

Note that again parsimonious reductions are sufficient to obtain the completeness results in Theorem 4.9.

## 5 Circumscription

The third non-monotonic logic we will turn to is circumscription, which, instead of extending classical logic with default rules or introspection, restricts attention to minimal models. Circumscription was introduced by McCarthy [McC80] in 1980 to overcome the qualification problem, i.e., the problem of listing all preconditions required for an action to have its intended effect. His approach was to allow for the conclusion that the objects that can be shown by reasoning to have a certain property are the only objects satisfying this property. Following [Lif85], this is achieved by considering only those models that are minimal with respect to a preorder on the set of assignments. For ease of notation, we will identify an assignment σ with the set {p ∣ σ(p) = 1} of propositions it maps to 1.

###### Definition 5.1

Let P, Q and Z partition the set of propositions and let σ, σ′ be assignments. Define ≤(P,Z) as the preorder given by

σ ≤(P,Z) σ′ ⟺ σ ∩ P ⊆ σ′ ∩ P and σ ∩ Q = σ′ ∩ Q.

Using ≤(P,Z), we define a consequence relation as follows: for an assignment σ and a set Γ of formulae over propositions partitioned into P, Q and Z, σ is a circumscriptive model of Γ if σ is minimal w.r.t. ≤(P,Z) among all models of Γ. Accordingly, we can define the notion of (circumscriptive) implication: Γ circumscriptively implies a formula φ if φ is satisfied in all circumscriptive models of Γ. It is not hard to see that circumscription coincides with reasoning under the extended closed world assumption, in which all formulae involving only propositions from P that cannot be derived from Γ are assumed to be false [GPP89].
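Identifying assignments with sets of true propositions, the preorder can be rendered directly in code; the following small sketch (function and parameter names are our own) may help fix the definition.

```python
def leq(sigma, sigma2, P, Q):
    """sigma <=_(P,Z) sigma2: at most as large on the minimized
    propositions P and identical on the fixed propositions Q;
    the propositions in Z may vary freely."""
    return (sigma & P) <= (sigma2 & P) and (sigma & Q) == (sigma2 & Q)

# with P = {p} minimized and Q = {q} fixed: {q} lies below {p, q},
# whereas {p} does not (it disagrees with {p, q} on the fixed part)
print(leq(frozenset({"q"}), frozenset({"p", "q"}), {"p"}, {"q"}))  # True
```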

###### Example 5.2

Let , , and . The models of are , , , , , , , and . Of these, only , , and are minimal with respect to . Hence, , while .
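As a further illustration, circumscriptive models can be enumerated by brute force on small instances. The function names and the toy formula p ∨ q below are our own choices, not taken from the survey.

```python
from itertools import chain, combinations

def circumscriptive_models(props, holds, P, Q):
    """Enumerate all <=_(P,Z)-minimal models of a formula by exhaustive
    search; holds maps an assignment (a frozenset of true propositions)
    to a truth value.  Exponential, purely illustrative."""
    subsets = chain.from_iterable(
        combinations(sorted(props), r) for r in range(len(props) + 1))
    models = [frozenset(s) for s in subsets if holds(frozenset(s))]

    def strictly_below(a, b):  # a <_(P,Z) b
        return a & P < b & P and a & Q == b & Q

    return [m for m in models
            if not any(strictly_below(m2, m) for m2 in models)]

# toy instance: Gamma = {p ∨ q}, every proposition minimized (Q = Z = ∅)
ms = circumscriptive_models({"p", "q"}, lambda a: "p" in a or "q" in a,
                            {"p", "q"}, set())
```

On this instance the circumscriptive models are exactly {p} and {q}, so ¬(p ∧ q) follows circumscriptively from p ∨ q although it is not a classical consequence.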

The notions of circumscriptive models and circumscriptive inference naturally lead to the following decision problems, which have received extensive study in the literature:

(Circumscriptive model checking) Given a set Γ of formulae, a preorder ≤(P,Z) on the set of its propositions and an assignment σ, is σ a circumscriptive model of Γ?

(Circumscriptive inference) Given a set Γ of formulae, a formula φ and a preorder ≤(P,Z) on the set of their propositions, does Γ circumscriptively imply φ?

The circumscriptive model checking problem is dual to a generalization of the minimal satisfiability problem, i.e., the question whether a given formula has a model that is strictly smaller than a given assignment with respect to a given preorder; it is known to be coNP-complete in general [Cad92], whereas the circumscriptive inference problem was shown to be Π₂ᵖ-complete by Eiter and Gottlob in [EG93]. These results reveal that, like default logic and autoepistemic logic, circumscription exhibits an increase in the complexity of model checking and reasoning compared to classical propositional logic. This increase raises the question of restrictions that lower the complexity of these tasks. Accordingly, the complexity of these problems has been studied both for restricted sets of Boolean functions and in Schaefer's framework. We will consider the restrictions obtained from Schaefer's framework first.

Define the circumscriptive model checking problem for sets of relations from Γ as the problem to decide, given a set of applications of relations from Γ, an assignment σ and a partition (P, Q, Z) of the propositions, whether σ is a minimal model with respect to ≤(P,Z). In [KK01b], Kirousis and Kolaitis showed that, in Schaefer's framework, a restricted version of the circumscriptive model checking problem is dichotomic, a result which was later generalized to the general case in [KK03]:

###### Theorem 5.3 ([Kk03])

Let Γ be a set of relations. Then the circumscriptive model checking problem for sets of relations from Γ is

1. coNP-complete if Γ is not Schaefer and

2. in P in all other cases.

The tractability in case Γ is Schaefer is easy to verify: the circumscriptive model checking problem Turing-reduces to the satisfiability problem, which in this case is tractable by [Sch78]. To show the coNP-hardness in all remaining cases, Kirousis and Kolaitis give an involved three-step reduction.
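The Turing reduction underlying the tractable case can be sketched as follows: a model σ of Γ fails to be minimal iff Γ has a model that agrees with σ on Q, makes true no P-proposition that is false under σ, and falsifies at least one P-proposition that is true under σ. In the sketch below, the function names and the oracle interface `gamma_sat` are our own assumptions; one satisfiability query is issued per proposition in σ ∩ P.

```python
from itertools import combinations

def is_circumscriptive_model(gamma_sat, sigma, P, Q):
    """Check minimality of a model sigma of Gamma via |P| oracle queries.
    gamma_sat(forced_true, forced_false) abstracts satisfiability of
    Gamma under forced literals; for Schaefer Gamma it is polynomial."""
    fixed_true = sigma & Q                   # keep the fixed part Q as in sigma
    fixed_false = (Q - sigma) | (P - sigma)  # allow no new P-propositions
    for p in sigma & P:
        # a model of Gamma with p switched off witnesses non-minimality
        if gamma_sat(fixed_true, fixed_false | {p}):
            return False
    return True

# hand-coded toy oracle for Gamma = {p ∨ q} over the propositions {p, q}
def sat_p_or_q(forced_true, forced_false):
    for r in range(3):
        for s in combinations(["p", "q"], r):
            a = set(s)
            if forced_true <= a and not (a & forced_false) and ("p" in a or "q" in a):
                return True
    return False

print(is_circumscriptive_model(sat_p_or_q, frozenset({"p"}), {"p", "q"}, set()))  # True
```

The complement structure is visible here: non-minimality is an existential (NP-style) question, which is why unrestricted model checking sits in coNP.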

In addition to that, Kirousis and Kolaitis also classified the circumscriptive model checking problem for possible sets of available Boolean functions in an unpublished note. Define the circumscriptive model checking problem for sets of B-formulae as the problem to decide, given a set Σ of B-formulae, an assignment σ and a partition