The constraint satisfaction problem (CSP) involves deciding, given a set of variables and a set of constraints on the variables, whether or not there is an assignment to the variables satisfying all of the constraints. Cases of the constraint satisfaction problem appear in many fields of study, including artificial intelligence, spatial and temporal reasoning, logic, combinatorics, and algebra. Indeed, the constraint satisfaction problem is flexible in that it admits a number of equivalent formulations. In this paper, we work with the formulation as the relational homomorphism problem: given two similar relational structures A and B, does there exist a homomorphism from A to B? In this formulation, one can view each relation of A as containing variable tuples that are constrained together, and the corresponding relation of B as containing the permissible values for the variable tuples.
The constraint satisfaction problem is in general NP-hard; this general intractability has motivated the study of restricted versions of the CSP that have various desirable complexity and algorithmic properties. A natural and well-studied way to restrict the CSP is to fix the value relations that can be used to pose constraints; in the homomorphism formulation, this corresponds to fixing the right-hand side structure B, which is also known as the constraint language. Each structure B then gives rise to a problem CSP(B), and one obtains a rich family of problems that includes boolean satisfiability problems, graph homomorphism problems, and satisfiability problems on algebraic equations. One of the primary current research threads involving such problems is to understand for which finite-universe constraint languages B the problem CSP(B) is polynomial-time tractable; there is also work on characterizing the languages for which the problem is contained in lower complexity classes such as L (logarithmic space) and NL (non-deterministic logarithmic space) [13, 21]. With such aims providing motivation, there have been efforts to characterize the languages amenable to solution by certain algorithmic techniques, notably, representing solution spaces by generating sets and consistency methods [22, 2, 6], which we now turn to discuss.
Checking for consistency is a primary reasoning technique for the practical solution of the CSP, and has been studied theoretically from many viewpoints [22, 2, 4, 1, 3, 6, 5]. The most basic form of consistency is arc consistency, which algorithmically involves performing inferences concerning the set of feasible values for each variable. The question of how to efficiently implement an arc consistency check has been studied intensely, and highly optimized implementations that are linear in both time and space have been presented. In general, a consistency check typically involves running an efficient method that performs inference on bounded-size sets of variables, and which can sometimes detect that a CSP instance is inconsistent and has no solution. While these methods exhibit one-sided error in that they do not catch all non-soluble CSP instances (as one expects from the conjunction of their efficiency and the intractability of the CSP), it has been shown that, for certain constraint languages, they can serve as complete decision procedures, by which is meant that they detect an inconsistency if (and only if) an instance has no solution. As an example, unit propagation, a consistency method that can be viewed as arc consistency specialized to SAT formulas, is well-known to decide the Horn-SAT problem in this sense.
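To make the Horn-SAT example concrete, here is a minimal unit-propagation sketch in Python; the (head, body) clause encoding and the function name are illustrative assumptions, not notation from the paper.

```python
def horn_sat(clauses):
    """Unit propagation as a complete decision procedure for Horn-SAT.
    Each clause is encoded as (head, body): body is a set of variables and
    head is a variable or None, representing (AND of body) -> head; a None
    head encodes a purely negative clause. Returns the minimal model, or
    None if the formula is unsatisfiable. (Encoding is illustrative.)"""
    true_vars = set()
    changed = True
    while changed:
        changed = False
        for head, body in clauses:
            if body <= true_vars:          # all premises derived
                if head is None:
                    return None            # empty head: contradiction found
                if head not in true_vars:
                    true_vars.add(head)    # unit propagation step
                    changed = True
    return true_vars
```

For Horn clauses the one-sided error vanishes: propagation reaches the minimal model, and the instance is unsatisfiable exactly when a purely negative clause fires.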
In this paper, we study arc consistency (AC) and three natural extensions thereof from the perspective of constraint languages. The extensions of AC that we study are look-ahead arc consistency (LAAC), peek arc consistency (PAC), and singleton arc consistency (SAC) [16, 7]. Each of these algorithms is natural, conceptually simple, readily understandable, and easily implementable using arc consistency as a black box. Tractability results for constraint languages have been presented for AC by Feder and Vardi (for instance), and for LAAC and PAC in the previously cited work. In fact, for each of these three algorithms, characterizations of the class of tractable languages have been given, as we discuss in the paper.
We give a uniform presentation of these algorithms (Section 3), and conduct a comparison of these algorithms on the basis of which languages they solve (Section 4). Our comparison shows, roughly, that the algorithms can be placed into a hierarchy: solvability of a language by AC or LAAC implies solvability by PAC; solvability by PAC in turn implies solvability by SAC (see Section 4 for precise statements). We also study the strictness of the containments shown. We thus contribute to a basic, foundational understanding of the scope of these algorithms and of the situations in which these algorithms can be demonstrated to be effective.
We then present new tractability results for singleton arc consistency (Section 5). We prove that languages having certain types of 2-semilattice polymorphisms can be solved by singleton arc consistency; and, we prove that any language having a majority polymorphism is solvable by singleton arc consistency. The presence of a majority polymorphism is a robust and well-studied condition: majority polymorphisms were used to give some of the initial language tractability results, are known to exactly characterize the languages such that 3-consistency implies global consistency (we refer to the literature for definitions and more details), and gave one of the first large classes of languages whose constraint satisfaction problem could be placed in non-deterministic logarithmic space. While the languages that we study are already known to be polynomial-time tractable [20, 10], from the standpoint of understanding the complexity and algorithmic properties of constraint languages, we believe our tractability results to be particularly attractive for two reasons. First, relative to a fixed language, singleton arc consistency runs in quadratic time, constituting a highly non-trivial running time improvement over the cubic time bound that was previously known for the studied languages. Also, in showing that these languages are amenable to solution by singleton arc consistency, we demonstrate their polynomial-time tractability in an alternative fashion via an algorithm that is different from the previously used ones; the techniques that we employ expose a different type of structure in the studied constraint languages.
Our definitions and notation are fairly standard. For a positive integer n, we use the notation [n] to denote the set containing the first n positive integers, that is, the set {1, …, n}.
A tuple over a set A is an element of A^k for a value k ≥ 1 called the arity of the tuple; when t is a tuple, we often use the notation t = (t_1, …, t_k) to denote its entries. A relation over a set A is a subset of A^k for a value k ≥ 1 called the arity of the relation. We use π_i to denote the operator that projects onto the ith coordinate: π_i(t) denotes the ith entry of a tuple t, and for a relation R we define π_i(R) = {π_i(t) : t ∈ R}. Similarly, for a subset I ⊆ [k] whose elements are i_1 < ⋯ < i_m, we use π_I(t) to denote the tuple (t_{i_1}, …, t_{i_m}), and we define π_I(R) = {π_I(t) : t ∈ R}.
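As a small illustration of the projection operators just defined, a Python sketch (coordinates are 0-indexed here, an assumption made for code readability):

```python
def proj(rel, indices):
    """Project a relation (a set of tuples) onto the listed coordinates,
    mirroring the projection operators defined above (0-indexed)."""
    return {tuple(t[i] for i in indices) for t in rel}
```

For a single coordinate, pass a one-element index list; the result is then a set of 1-tuples.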
A signature σ is a set of symbols, each of which has an associated arity. A structure A over signature σ consists of a universe A, which is a set, and a relation R^A ⊆ A^k for each symbol R ∈ σ of arity k. (Note that in this paper, we are concerned only with relational structures, which we refer to simply as structures.) Throughout, we will use the bold capital letters A, B, … to denote structures, and the corresponding non-bold capital letters A, B, … to denote their universes. We say that a structure is finite if its universe has finite size. Unless stated otherwise, we assume all structures under discussion in this paper to be finite. We say that a structure B has all constants if for each b ∈ B, there is a relation symbol R with R^B = {(b)}.
When two structures A and B are defined over the same signature σ, we say that they are similar. We define the following notions on similar structures. For similar structures A and B over a signature σ, we say that A is an induced substructure of B if A ⊆ B and for every R ∈ σ of arity k, it holds that R^A = R^B ∩ A^k. Observe that for a structure B and a subset A ⊆ B, there is exactly one induced substructure of B with universe A. For similar structures A and B over a signature σ, the product structure A × B is defined to be the structure with universe A × B and such that R^{A×B} = {((a_1, b_1), …, (a_k, b_k)) : (a_1, …, a_k) ∈ R^A, (b_1, …, b_k) ∈ R^B} for all R ∈ σ of arity k. We use B^n to denote the n-fold product B × ⋯ × B.
We say that a structure A′ over signature σ′ is an expansion of another structure A over signature σ if (1) σ ⊆ σ′, (2) the universe of A′ is equal to the universe of A, and (3) for every symbol R ∈ σ, it holds that R^{A′} = R^A. We will use the following non-standard notation. For any structure B (over signature σ) and any subset S ⊆ B, we define (B, S) to be the expansion of B with the signature σ ∪ {R_S} where R_S is a new symbol of arity 1, defined by R_S^{(B,S)} = {(s) : s ∈ S} and R^{(B,S)} = R^B for all R ∈ σ. More generally, for a structure B (over σ) and a sequence of subsets S_1, …, S_n ⊆ B, we define (B, S_1, …, S_n) to be the expansion of B with the signature σ ∪ {R_{S_1}, …, R_{S_n}} where R_{S_1}, …, R_{S_n} are new symbols of arity 1, defined by R_{S_i}^{(B, S_1, …, S_n)} = {(s) : s ∈ S_i} for all i ∈ [n], and R^{(B, S_1, …, S_n)} = R^B for all R ∈ σ.
Homomorphisms and the constraint satisfaction problem.
For similar structures A and B over the signature σ, a homomorphism from A to B is a mapping h: A → B such that for every symbol R of σ and every tuple (a_1, …, a_k) ∈ R^A, it holds that (h(a_1), …, h(a_k)) ∈ R^B. We use A → B to indicate that there is a homomorphism from A to B; when this holds, we also say that A is homomorphic to B. It is well-known and straightforward to verify that the homomorphism relation is transitive, that is, if A → B and B → C, then A → C.
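The homomorphism definition can be checked mechanically. The following Python sketch assumes a structure is encoded as a pair (universe, rels), where rels maps each relation symbol to a set of tuples; this encoding is an illustrative assumption, not notation from the paper. The brute-force search mirrors the definition of satisfiability and is exponential, so it is meant only for tiny structures.

```python
from itertools import product

def is_homomorphism(h, A, B):
    """Check that the mapping h (a dict) is a homomorphism from structure A
    to a similar structure B: every tuple of every relation of A must map,
    coordinatewise, into the corresponding relation of B."""
    _, rels_A = A
    _, rels_B = B
    return all(tuple(h[a] for a in t) in rels_B[R]
               for R, tuples in rels_A.items()
               for t in tuples)

def find_homomorphism(A, B):
    """Brute-force search for a homomorphism A -> B; returns a dict or None."""
    univ_A = sorted(A[0])
    univ_B = sorted(B[0])
    for image in product(univ_B, repeat=len(univ_A)):
        h = dict(zip(univ_A, image))
        if is_homomorphism(h, A, B):
            return h
    return None
```

For example, a single directed edge maps homomorphically into the symmetric edge (the graph K2), while a self-loop does not.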
The constraint satisfaction problem (CSP) is the problem of deciding, given as input a pair (A, B) of similar structures, whether or not there exists a homomorphism from A to B. When (A, B) is an instance of the CSP, we will also call a homomorphism from A to B a satisfying assignment; say that the instance is satisfiable if there exists such a homomorphism; and, say that the instance is unsatisfiable if there does not exist such a homomorphism. We generally assume that in an instance of the CSP, the left-hand side structure A contains finitely many tuples. For any structure B (over σ), the constraint satisfaction problem for B, denoted by CSP(B), is the constraint satisfaction problem where the right-hand side structure is fixed to be B, that is, the problem of deciding, given as input a structure A over σ, whether or not there exists a homomorphism from A to B. In discussing a problem of the form CSP(B), the structure B is often referred to as the template or constraint language. There are several equivalent definitions of the constraint satisfaction problem. For instance, in logic, the constraint satisfaction problem can be formulated as the model checking problem for primitive positive sentences over relational structures, and in database theory, it can be formulated as the containment problem for conjunctive queries.
When f: A^k → A is an operation on A and t^1, …, t^k are tuples of the same arity m over A, we use f(t^1, …, t^k) to denote the arity-m tuple obtained by applying f coordinatewise, that is, f(t^1, …, t^k) = (f(t^1_1, …, t^k_1), …, f(t^1_m, …, t^k_m)).
An operation f: B^k → B is a polymorphism of a structure B over σ if for every symbol R ∈ σ and any tuples t^1, …, t^k ∈ R^B, it holds that f(t^1, …, t^k) ∈ R^B. That is, each relation R^B is closed under the coordinatewise action of f. Equivalently, an operation f: B^k → B is a polymorphism of B if it is a homomorphism from B^k to B.
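Continuing with the same illustrative (universe, rels) encoding, the closure condition in the definition of a polymorphism can be tested directly by brute force:

```python
from itertools import product

def is_polymorphism(f, B, k):
    """Check whether the k-ary operation f (a Python function on the
    universe of B) is a polymorphism of B = (universe, rels): every
    relation must be closed under applying f coordinatewise to any
    k of its tuples. Brute force; for small structures only."""
    _, rels = B
    for R, tuples in rels.items():
        for rows in product(tuples, repeat=k):
            arity = len(rows[0])
            # apply f coordinatewise across the k chosen tuples
            image = tuple(f(*(row[i] for row in rows)) for i in range(arity))
            if image not in rels[R]:
                return False
    return True
```

For instance, the binary minimum operation on {0, 1} is a polymorphism of a relation exactly when the relation is closed under coordinatewise min.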
In this section, we give a uniform presentation of the four algorithms under investigation in this paper: arc consistency, look-ahead arc consistency, peek arc consistency, and singleton arc consistency, presented in Sections 3.1, 3.2, 3.3, and 3.4, respectively. The results on the first three algorithms come from previous work, as we discuss in presenting each of these algorithms; for singleton arc consistency, we here develop results similar to those given for the other algorithms.
Our treatment of arc consistency, peek arc consistency, and singleton arc consistency is uniform: for each of these algorithms, we present a homomorphism-based consistency condition, we show that the algorithm checks precisely this consistency condition, and we give an algebraic condition describing the structures B such that the algorithm solves CSP(B). These three algorithms give one-sided consistency checks: each either correctly rejects an instance as unsatisfiable or outputs “?”, which can be interpreted as a report that it is unknown whether or not the instance is satisfiable. The other algorithm, look-ahead arc consistency, has a somewhat different character. It attempts to build a satisfying assignment one variable at a time, using arc consistency as a filtering criterion; it either returns a satisfying assignment, or outputs “?”.
Throughout this section and in later sections, we will make use of a structure P(B) that is defined for every structure B, as follows [18, 15]. For a structure B (over σ), we define P(B) to be the structure with universe ℘(B) ∖ {∅} and where, for every symbol R ∈ σ of arity k, R^{P(B)} = {(π_1(R′), …, π_k(R′)) : R′ ⊆ R^B, R′ ≠ ∅}. Here, ℘(B) denotes the power set of the set B.
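Under the same illustrative (universe, rels) encoding, and assuming the projection-based definition of P(B) (universe: the nonempty subsets of B; a tuple of sets belongs to R^{P(B)} iff it lists the coordinatewise projections of some nonempty subrelation of R^B), the construction can be sketched as follows. It is exponential and intended only for tiny structures.

```python
from itertools import combinations

def power_structure(B):
    """Construct P(B) for a structure B = (universe, rels): the universe of
    P(B) is the set of nonempty subsets of B, and a tuple of sets is in
    R^{P(B)} iff it is the tuple of coordinatewise projections of some
    nonempty subrelation R' of R^B (projection-based definition assumed)."""
    univ, rels = B

    def nonempty_subsets(items):
        items = list(items)
        return [frozenset(c)
                for r in range(1, len(items) + 1)
                for c in combinations(items, r)]

    p_univ = set(nonempty_subsets(univ))
    p_rels = {}
    for R, tuples in rels.items():
        p_rels[R] = set()
        for sub in nonempty_subsets(tuples):
            arity = len(next(iter(sub)))
            # coordinatewise projections of the chosen subrelation
            p_rels[R].add(tuple(frozenset(t[i] for t in sub)
                                for i in range(arity)))
    return p_univ, p_rels
```

Elements of the new universe are frozensets so that they can themselves be members of sets.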
3.1 Arc Consistency
We now present the arc consistency algorithm. The main idea of the algorithm is to associate to each element a ∈ A a set of values S(a) ⊆ B which, throughout the execution of the algorithm, has the property that for any solution h, it must hold that h(a) ∈ S(a). The algorithm continually shrinks the sets in a natural fashion until they stabilize; at this point, if some set S(a) is the empty set, then no solution can exist, and the algorithm rejects the instance.
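A minimal sketch of this procedure, under the illustrative (universe, rels) encoding used throughout these examples:

```python
def arc_consistency(A, B):
    """Arc consistency sketch: maintain, for each element a of A, a set S[a]
    of values of B that a solution could map a to; shrink the sets until
    they stabilize. Returns the final sets, or None if some set becomes
    empty (the instance is rejected as unsatisfiable)."""
    univ_A, rels_A = A
    univ_B, rels_B = B
    S = {a: set(univ_B) for a in univ_A}
    changed = True
    while changed:
        changed = False
        for R, tuples in rels_A.items():
            for t in tuples:
                # tuples of R^B compatible with the current sets
                support = [u for u in rels_B[R]
                           if all(u[i] in S[t[i]] for i in range(len(t)))]
                for i, a in enumerate(t):
                    vals = {u[i] for u in support}
                    if not S[a] <= vals:
                        S[a] &= vals
                        changed = True
    return None if any(not S[a] for a in univ_A) else S
```

Note the one-sided error discussed in the introduction: the triangle against the symmetric edge (K2) is unsatisfiable, yet arc consistency does not reject it, while the triangle against a single directed edge is rejected.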
Feder and Vardi have studied arc consistency, under an equivalent formulation in terms of Datalog programs, for constraint languages. The results in this section are due to this reference. The connection of the results in Feder and Vardi with arc consistency was made explicit in Dalmau and Pearson.
An instance (A, B) has the arc consistency condition (ACC) if there exists a homomorphism from A to P(B).
The arc consistency algorithm does not reject an instance (A, B) if and only if the instance has the ACC.
Let B be a structure. We say that arc consistency solves CSP(B) if for all structures A, the following holds: (A, B) has the ACC implies that there is a homomorphism A → B.
Note that the converse of the condition given in this definition always holds: if h is a homomorphism from A to B, then the mapping sending each a ∈ A to the set {h(a)} is a homomorphism from A to P(B).
Let B be a structure. Arc consistency solves CSP(B) if and only if there is a homomorphism P(B) → B.
3.2 Look-Ahead Arc Consistency
We now present the look-ahead arc consistency algorithm. It attempts to construct a satisfying assignment by setting one variable at a time, using arc consistency as a filter to find a suitable value for each variable.
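One greedy reading of this procedure can be sketched as follows; committing to the first value that survives the arc consistency check is a simplifying assumption of the sketch, and the inner ac_passes routine plays the role of the arc consistency black box.

```python
def laac(A, B):
    """Look-ahead arc consistency sketch: set one variable at a time,
    committing to the first value whose forced assignment still passes
    arc consistency. Returns an assignment, or None (the '?' output)."""
    univ_A, rels_A = A
    univ_B, rels_B = B

    def ac_passes(init):
        # arc consistency as a black box, started from the given sets
        S = {x: set(v) for x, v in init.items()}
        changed = True
        while changed:
            changed = False
            for R, tuples in rels_A.items():
                for t in tuples:
                    support = [u for u in rels_B[R]
                               if all(u[i] in S[t[i]] for i in range(len(t)))]
                    for i, x in enumerate(t):
                        vals = {u[i] for u in support}
                        if not S[x] <= vals:
                            S[x] &= vals
                            changed = True
        return all(S[x] for x in univ_A)

    assignment = {}
    for a in sorted(univ_A):
        for b in sorted(univ_B):
            init = {x: ({assignment[x]} if x in assignment
                        else {b} if x == a else set(univ_B))
                    for x in univ_A}
            if ac_passes(init):
                assignment[a] = b   # commit and move to the next variable
                break
        else:
            return None             # no value survives look-ahead: output "?"
    return assignment
```

If every variable receives a value, the final arc consistency check ran with all sets singletons, so the returned assignment is indeed a satisfying one.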
Look-ahead arc consistency was introduced and studied by Chen and Dalmau, and the theorem that follows is due to them. This algorithm can be viewed as a generalization of an algorithm for SAT studied by Del Val.
Let B be a structure. We say that look-ahead arc consistency solves CSP(B) if for all structures A, the following holds: if there exists a homomorphism A → B, then the look-ahead arc consistency algorithm, given (A, B), outputs such a homomorphism.
Let B be a structure. Look-ahead arc consistency solves CSP(B) if and only if there is a homomorphism g: P(B) → B such that g({b}) = b for all b ∈ B.
3.3 Peek Arc Consistency
We now present the peek arc consistency algorithm. It attempts to find, for each variable a, a value b such that when a is set to b, the arc consistency check is passed.
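A sketch under the same illustrative encoding; the per-(variable, value) arc consistency checks are independent of one another, which is the source of the easy parallelization noted in the cited work.

```python
def peek_ac(A, B):
    """Peek arc consistency sketch: for each variable a of A, check whether
    some value b of B passes arc consistency when a is forced to b; reject
    (return False) iff some variable has no passing value."""
    univ_A, rels_A = A
    univ_B, rels_B = B

    def ac_passes(forced_var, forced_val):
        # arc consistency with one variable "peeked" at a single value
        S = {x: ({forced_val} if x == forced_var else set(univ_B))
             for x in univ_A}
        changed = True
        while changed:
            changed = False
            for R, tuples in rels_A.items():
                for t in tuples:
                    support = [u for u in rels_B[R]
                               if all(u[i] in S[t[i]] for i in range(len(t)))]
                    for i, x in enumerate(t):
                        vals = {u[i] for u in support}
                        if not S[x] <= vals:
                            S[x] &= vals
                            changed = True
        return all(S[x] for x in univ_A)

    # reject iff some variable has no value passing the peeked AC check
    return all(any(ac_passes(a, b) for b in univ_B) for a in univ_A)
```

Unlike plain arc consistency, this check does reject the triangle against the symmetric edge K2: forcing any vertex to either color propagates around the odd cycle and empties a set.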
Peek arc consistency was introduced and studied by Bodirsky and Chen; the notions and results that follow come from them. In their work, the algorithm is shown to solve certain constraint languages, including some languages having infinite-size universes; such languages actually gave the motivation for introducing the algorithm. In this work, it is pointed out that peek arc consistency can be readily parallelized; by invoking the arc consistency checks independently in parallel, one can achieve a linear parallel running time.
An instance (A, B) has the peek arc consistency condition (PACC) if for every element a ∈ A, there exists a homomorphism h from A to P(B) such that h(a) is a singleton.
The peek arc consistency algorithm does not reject an instance (A, B) if and only if the instance has the PACC.
Let B be a structure. We say that peek arc consistency solves CSP(B) if for all structures A, the following holds: (A, B) has the PACC implies that there is a homomorphism A → B.
The converse of the condition given in this definition always holds. Suppose that h is a homomorphism from A to B; then, the mapping taking each a ∈ A to the singleton {h(a)} is a homomorphism from A to P(B), and hence (A, B) has the PACC.
For n ≥ 1, consider the induced substructure of P(B)^n whose universe contains an n-tuple of P(B) if and only if at least one coordinate of the tuple is a singleton.
Let B be a structure. Peek arc consistency solves CSP(B) if and only if for all n ≥ 1 there is a homomorphism from this induced substructure of P(B)^n to B.
3.4 Singleton Arc Consistency
We now present the singleton arc consistency algorithm. As with arc consistency, this algorithm associates to each element a ∈ A a set S(a) of feasible values. It then continually checks, for pairs (a, b) with a ∈ A and b ∈ S(a), whether or not arc consistency can be established with respect to the sets when a is assigned to b; if for some pair it cannot, then b is removed from the set S(a). As with arc consistency, this algorithm’s outer loop runs until the sets stabilize, and the algorithm rejects if one of the sets is equal to the empty set.
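A sketch of the procedure under the illustrative encoding; the outer loop re-runs the per-pair arc consistency checks until the sets stabilize.

```python
def singleton_ac(A, B):
    """Singleton arc consistency sketch: keep a set S[a] per element of A,
    and delete a value b from S[a] whenever arc consistency, started from
    the current sets with a forced to b, fails. Returns the stabilized
    sets, or None if some set empties (instance rejected)."""
    univ_A, rels_A = A
    univ_B, rels_B = B

    def ac_ok(sets, forced_var, forced_val):
        # arc consistency started from the current sets, with one value forced
        S = {x: ({forced_val} if x == forced_var else set(sets[x]))
             for x in univ_A}
        changed = True
        while changed:
            changed = False
            for R, tuples in rels_A.items():
                for t in tuples:
                    support = [u for u in rels_B[R]
                               if all(u[i] in S[t[i]] for i in range(len(t)))]
                    for i, x in enumerate(t):
                        vals = {u[i] for u in support}
                        if not S[x] <= vals:
                            S[x] &= vals
                            changed = True
        return all(S[x] for x in univ_A)

    S = {a: set(univ_B) for a in univ_A}
    changed = True
    while changed:
        changed = False
        for a in univ_A:
            for b in list(S[a]):
                if not ac_ok(S, a, b):
                    S[a].discard(b)       # value b cannot survive: remove it
                    changed = True
        if any(not S[a] for a in univ_A):
            return None                   # some set emptied: reject
    return S
```

Like peek arc consistency, this check rejects the triangle against the symmetric edge K2, which plain arc consistency does not.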
Singleton arc consistency was introduced by Debruyne and Bessiere. We now give a development of singleton arc consistency analogous to that of arc consistency and peek arc consistency.
An instance (A, B) has the singleton arc consistency condition (SACC) if there exists a mapping f: A → ℘(B) ∖ {∅} such that for all a ∈ A and b ∈ f(a), there exists a homomorphism h_{a,b} from A to P(B) where:
h_{a,b}(a) = {b}, and
for all x ∈ A, it holds that h_{a,b}(x) ⊆ f(x).
The singleton arc consistency algorithm does not reject an instance (A, B) if and only if the instance has the SACC.
Proof. Suppose that the singleton arc consistency algorithm does not reject an instance (A, B). Let S(x) denote the sets computed by the algorithm at the point of termination, and define f to be the mapping where f(x) = S(x) for all x ∈ A. Let a ∈ A and b ∈ f(a). By the definition of the algorithm, the pair obtained from (A, B) by assigning a to b, relative to the sets S, has the ACC, and thus the desired homomorphism h_{a,b} exists.
Now, suppose that the instance (A, B) has the SACC, and let f be a mapping with the described properties. We show that throughout the execution of the algorithm, it holds that f(x) ⊆ S(x) for all x ∈ A. First, S(x) is initialized with S(x) = B for every x ∈ A. Next, we show that when a ∈ A and b ∈ f(a), then b is never removed from S(a) by the algorithm. This is because by the definition of the SACC, there exists a homomorphism h_{a,b} with h_{a,b}(a) = {b} such that for all x ∈ A, it holds that h_{a,b}(x) ⊆ f(x). Since f(x) ⊆ S(x) by the inductive assumption, the corresponding pair has the ACC and hence the algorithm does not remove b from S(a).
Let B be a structure. We say that singleton arc consistency solves CSP(B) if for all structures A, the following holds: (A, B) has the SACC implies that there is a homomorphism A → B.
The converse of the condition given in this definition always holds: suppose that h is a homomorphism from A to B. Then, the instance (A, B) has the SACC via the mapping f where f(x) = {h(x)} for all x ∈ A and the mappings h_{a,b} defined by h_{a,b}(x) = {h(x)} for all x ∈ A.
For n ≥ 1, consider the induced substructure of P(B)^n whose universe contains an n-tuple (S_1, …, S_n) of P(B) if and only if it holds that, for every b ∈ S_1 ∪ ⋯ ∪ S_n, there is a coordinate i ∈ [n] with S_i = {b}.
Let B be a structure. Singleton arc consistency solves CSP(B) if and only if for all n ≥ 1 there is a homomorphism from this induced substructure of P(B)^n to B.
Proof. First we show that if singleton arc consistency solves CSP(B), then for all n ≥ 1 there is a homomorphism from the induced substructure of P(B)^n to B. Let n ≥ 1, and let A denote this induced substructure; we show that (A, B) has the SACC. From this, there is a homomorphism from A to B, since the singleton arc consistency algorithm solves CSP(B).
Let f be the mapping f(t) = S_1 ∪ ⋯ ∪ S_n for all tuples t = (S_1, …, S_n) of A. Now let us consider an arbitrary tuple t = (S_1, …, S_n) of A and an arbitrary b ∈ f(t). Since t is in the universe of A, there is an i such that S_i = {b}. Thus, the homomorphism π_i that projects onto the ith coordinate satisfies π_i(t) = {b}, and for all tuples t′ = (S′_1, …, S′_n) of A, it holds that π_i(t′) = S′_i ⊆ f(t′). Hence, (A, B) has the SACC.
For the other direction, we show that if for all n ≥ 1 there is a homomorphism from the described induced substructure of P(B)^n to B, then singleton arc consistency solves CSP(B). Thus, we have to show that there exists a homomorphism from A to B whenever (A, B) has the SACC. Let f be the mapping from the definition of the SACC, and let h_1, …, h_n be an enumeration of the set of homomorphisms {h_{a,b} : a ∈ A, b ∈ f(a)}. Further, let H be the mapping H(x) = (h_1(x), …, h_n(x)). Now, for every element x ∈ A the image H(x) is a tuple of the induced substructure: for every b ∈ h_1(x) ∪ ⋯ ∪ h_n(x), it holds that b ∈ f(x) and thus there exists a homomorphism h_{x,b} among h_1, …, h_n that maps x to the singleton {b}; so, we have that some coordinate of H(x) equals {b}. Since H is a homomorphism from A to the induced substructure, we can compose H and a homomorphism from that substructure to B, which we know to exist by assumption, to get a homomorphism from A to B. Consequently, singleton arc consistency solves CSP(B).
4 Strength Comparison
In this section, we investigate relationships among the sets of structures solvable by the various algorithms presented. We show that for the structures having all constants, AC solves a strictly smaller set of structures than LAAC does; on the other hand, we show that there is a structure (not having all constants) solvable by AC but not LAAC. We then show that the structures solvable by AC or LAAC are strictly contained in those solvable by PAC; and, in turn, that the structures solvable by PAC are strictly contained in those solvable by SAC. We also show that the structures solvable by SAC (and hence, those solvable by any of the studied algorithms) all fall into the class of structures having bounded width; bounded width is a well-studied condition admitting multiple characterizations [18, 22, 6].
Suppose that B is a structure having all constants. If CSP(B) is solvable by AC, then it is solvable by LAAC.
Proof. By Theorem 4, there is a homomorphism g: P(B) → B. Since the structure B has all constants, for each b ∈ B there is a relation symbol R with R^B = {(b)}. Since ({b}) ∈ R^{P(B)}, it must hold that (g({b})) ∈ R^B, from which it follows that g({b}) = b. The mapping g is then a homomorphism of the type described in Theorem 6.
There exists a structure B having all constants such that CSP(B) is solvable by LAAC but not by AC.
Proof. Take to be the relational structure with universe over signature where
It is straightforward to verify that the mapping defined by , , and for all is a homomorphism from to satisfying the condition of Theorem 6. Hence, the problem is solvable by LAAC.
To show that the problem is not solvable by AC, let be an arbitrary mapping from to . We show that cannot be a homomorphism from to , which suffices by Theorem 4. Let . It holds that , but , and we are done.
There exists a structure B (not having all constants) such that CSP(B) is solvable by AC but not by LAAC.
Proof. Take to be the relational structure with universe over signature where and . The mapping that sends each element of to is a homomorphism from to , and hence AC solves by Theorem 4.
To show that the problem is not solvable by LAAC, let be an arbitrary mapping from to that satisfies for all . We show that cannot be a homomorphism from to , which suffices by Theorem 6. We consider two cases depending on the value of .
If , then we use the facts that and that ; we have that , which is not contained in , implying that is not a homomorphism of the desired type.
If , then we use the facts that and that ; we have that , which is not contained in , implying that is not a homomorphism of the desired type.
We now proceed to study PAC, and in particular, show that the structures solvable by AC or LAAC are solvable by PAC.
Let B be a structure. If CSP(B) is solvable by AC, then it is also solvable by PAC.
Let B be a structure. If CSP(B) is solvable by LAAC, then it is also solvable by PAC.
Proof. Suppose that look-ahead arc consistency solves CSP(B). By Theorem 6 there exists a homomorphism g: P(B) → B such that g({b}) = b for all b ∈ B. We want to show that peek arc consistency solves CSP(B) by using Theorem 10. Thus, we have to show that for all n ≥ 1 there is a homomorphism from the induced substructure of P(B)^n described in Section 3.3 to B.
Let . Let us consider the mapping with
defined for all tuples and all . First we want to show that is well defined. Let with , let and let be an index such that is a singleton. Let for a . We obtain that
with , because is applied to the singleton and . Similarly, we obtain that
Consequently, is well defined. Next, we prove that is a homomorphism. Let be a -ary relation and let be a tuple in this relation. Denote for all ; then, has to be in for all . Further, we know that there exists a tuple , because is not empty. Since is a homomorphism, the tuple
is in . Thus, is a homomorphism from to .
There exists a structure B having all constants such that CSP(B) is solvable by PAC but by neither LAAC nor AC.
Proof. Let us consider the structure with universe over the signature where
First we show that there is no homomorphism such that for all . Let us assume there is one. Since and the tuple , which is equal to , has to be contained in . Thus, cannot be equal to . On the other hand, and implies that is in . Therefore, has to be , which is a contradiction. This establishes that the structure is not solvable by LAAC; by Proposition 15, it follows that the structure is not solvable by AC.
Next we show that for all , there exists a homomorphism from to . Let be arbitrary and let be an arbitrary -tuple of . Further, let be the minimal number such that is , , or ; if such an does not exist, then . The homomorphism can be defined as follows:
Let us verify that is indeed a homomorphism: First of all, it is easy to see that is in whenever is in . Next, let us consider . Let and be arbitrary -tuples of such that is in for all . Let be the minimal number such that is , , or , and let be the minimal number such that is , , or , and if such an or does not exist, then or respectively. If , then has to be , , or and hence . Symmetrically, if , then . Therefore, . Now, if , then , which is in ; if , then follows directly from being in . Finally, let us consider two arbitrary -tuples and of such that is in for all . If , then or and cannot be in . If , then is in and, thus, in . If , then let be an index such that . Such an index has to exist, because is a tuple of . Since is in , has to be , and hence , and we appeal to one of the first two cases. The remaining case is and . In this case, is in and therefore in .
We now move on to study SAC; we show that SAC is strictly more powerful than PAC.