Strengthening neighbourhood substitution

07/13/2020
by Martin C. Cooper, et al.
IRIT

Domain reduction is an essential tool for solving the constraint satisfaction problem (CSP). In the binary CSP, neighbourhood substitution consists in eliminating a value if there exists another value which can be substituted for it in each constraint. We show that the notion of neighbourhood substitution can be strengthened in two distinct ways without increasing time complexity. We also show the theoretical result that, unlike neighbourhood substitution, finding an optimal sequence of these new operations is NP-hard.



1 Introduction

Domain reduction is classical in constraint satisfaction. Indeed, eliminating inconsistent values by what is now known as arc consistency [Waltz] predates the first formulation of the constraint satisfaction problem [DBLP:journals/ai/Mackworth77]. Maintaining arc consistency, which consists in eliminating values that can be proved inconsistent by examining a single constraint together with the current domains of the other variables, is ubiquitous in constraint solvers [DBLP:reference/fai/Bessiere06]. In binary CSPs, various algorithms have been proposed for enforcing arc consistency in O(ed²) time, where d denotes the maximum domain size and e the number of constraints [DBLP:journals/ai/MohrH86, ac]. Generic constraints on an unbounded number of variables are known as global constraints. Arc consistency can be efficiently enforced for many types of global constraints [DBLP:reference/fai/HoeveK06]. This has led to the development of efficient solvers providing a rich modelling language. Stronger notions of consistency have been proposed for domain reduction which lead to more eliminations but at greater computational cost [DBLP:reference/fai/Bessiere06, DBLP:conf/ijcai/BessiereD05, DBLP:conf/cp/WoodwardKCB12].
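To make the domain-reduction loop concrete, here is a minimal sketch of arc consistency enforcement in the style of AC-3. The instance encoding (relations stored as sets of allowed value pairs, given in both directions) is our own assumption, and this simple revision loop does not achieve the optimal time bound cited above.

```python
from collections import deque

def revise(domains, relations, x, y):
    """Remove values of x that have no support at y; report any change."""
    rel = relations[(x, y)]
    removed = False
    for a in list(domains[x]):
        if not any((a, b) in rel for b in domains[y]):
            domains[x].remove(a)
            removed = True
    return removed

def ac3(domains, relations):
    """Enforce arc consistency in place on a binary CSP instance."""
    queue = deque(relations.keys())
    while queue:
        x, y = queue.popleft()
        if revise(domains, relations, x, y):
            # x lost values: re-examine arcs pointing at x (except from y)
            queue.extend((z, w) for (z, w) in relations if w == x and z != y)
    return domains

# Tiny example: the constraint x < y with both domains equal to {1, 2, 3}
domains = {"x": {1, 2, 3}, "y": {1, 2, 3}}
lt = {(a, b) for a in (1, 2, 3) for b in (1, 2, 3) if a < b}
relations = {("x", "y"): lt, ("y", "x"): {(b, a) for (a, b) in lt}}
ac3(domains, relations)  # x loses 3 and y loses 1
```

Arc consistency only removes provably inconsistent values; the substitution rules studied in this paper remove values that may well participate in solutions, while still preserving satisfiability.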

In parallel, other research has explored methods that preserve the satisfiability of the CSP instance but do not preserve the set of solutions. When searching for a single solution, all but one branch of the explored search tree leads to a dead-end, and so any method for faster detection of unsatisfiability is clearly useful. An important example of such methods is the addition of symmetry-breaking constraints [DBLP:journals/constraints/ChuS15, DBLP:reference/fai/GentPP06]. In this paper we concentrate on domain-reduction methods. One family of satisfiability-preserving domain-reduction operations is value merging. For example, two values can be merged if the so-called broken triangle (BT) pattern does not occur on these two values [DBLP:journals/ai/CooperDMETZ16]. Other value-merging rules have been proposed which allow less merging than BT-merging but at a lower cost [vi] or more merging at a greater cost [wBTP, DMTCS:Naanaa]. Another family of satisfiability-preserving domain-reduction operations is based on the elimination of values that are not essential to obtain a solution [DBLP:conf/gcai/FreuderW17]. The basic operation in this family which corresponds most closely to arc consistency is neighbourhood substitution: a value b can be eliminated from a domain if there is another value a in the same domain such that b can be replaced by a in each tuple in each constraint relation (reduced to the current domains of the other variables) [DBLP:conf/aaai/Freuder91]. In binary CSPs, neighbourhood substitution can be applied until convergence in polynomial time [ns]. In this paper, we study notions of substitutability which are strictly stronger than neighbourhood substitutability but which can be applied within the same time complexity.
We say that one elimination rule R1 is stronger than (subsumes) another rule R2 if any value in a non-trivial instance (an instance with more than one variable) that can be eliminated by R2 can also be eliminated by R1, and R1 is strictly stronger (strictly subsumes R2) if there is also at least one non-trivial instance in which R1 can eliminate a value that R2 cannot. Two rules are incomparable if neither is stronger than the other.

To illustrate the strength of the new notions of substitutability that we introduce in this paper, consider the instances shown in Figure 1. These instances are all globally consistent (each variable-value assignment occurs in a solution) and neighbourhood substitution is not powerful enough to eliminate any values. In this paper, we introduce three novel value-elimination rules, defined in Section 2: SS, CNS and SCSS. We will show that snake substitution (SS) allows us to reduce all domains to singletons in the instance in Figure 1(a). Using the notation D(x) for the domain of the variable x, conditioned neighbourhood-substitution (CNS) allows us to eliminate the value 0 from one domain and the value 2 from another in the instance shown in Figure 1(b), reducing the constraint between the two variables concerned to a trivial constraint (the complete relation). Snake-conditioned snake-substitution (SCSS) subsumes both SS and CNS and allows us to reduce all domains to singletons in the instance in Figure 1(c) (as well as in the instances in Figure 1(a),(b)).

Figure 1: (a) A 4-variable CSP instance over boolean domains; (b) a 3-variable CSP instance over domains with constraints , and ; (c) A 4-variable CSP instance over domain with constraints , , , , and .

In Section 2 we define the substitution operations SS, CNS and SCSS. In Section 3 we prove the validity of these three substitution operations, in the sense that they define satisfiability-preserving value-elimination rules. In Section 4 we explain in detail the examples in Figure 1 and we give other examples from the semantic labelling of line drawings. Section 5 discusses the complexity of applying these value-elimination rules until convergence: the time complexity of SS and CNS is no greater than that of neighbourhood substitution (NS) even though these rules are strictly stronger. However, unlike NS, finding an optimal sequence of value eliminations by SS or CNS is NP-hard: this is shown in Section 6.

2 Definitions

We study binary constraint satisfaction problems.

A binary CSP instance comprises

  • a set of variables X = {x_1, ..., x_n},

  • a domain D(x_i) for each variable x_i (1 ≤ i ≤ n), and

  • a binary constraint relation R_ij ⊆ D(x_i) × D(x_j) for each pair of distinct variables x_i, x_j (1 ≤ i < j ≤ n).

For notational convenience, we assume that there is exactly one binary relation R_ij for each pair of variables. Thus, if x_i and x_j do not constrain each other, then we consider that there is a trivial constraint between them with R_ij = D(x_i) × D(x_j). Furthermore, R_ji (viewed as a boolean matrix) is always the transpose of R_ij. A solution to the instance is an n-tuple (v_1, ..., v_n) such that v_i ∈ D(x_i) for each i, and for each pair of distinct i, j, (v_i, v_j) ∈ R_ij.
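The definition above can be encoded directly. The following sketch (our own representation, not notation from the paper) stores domains as sets and one relation per ordered pair of variables, fills in the transpose direction automatically, and reads an absent pair of variables as a trivial (complete) constraint.

```python
def make_instance(domains, relations):
    """Fill in the transpose direction R_ji from each given R_ij,
    so that exactly one relation is stored per ordered pair."""
    full = dict(relations)
    for (x, y), rel in relations.items():
        full[(y, x)] = {(b, a) for (a, b) in rel}
    return domains, full

def is_solution(domains, relations, assignment):
    """A solution assigns each variable a value from its domain and
    satisfies every stored constraint; an absent pair of variables is
    read as a trivial (complete) constraint and needs no check."""
    if any(assignment[x] not in domains[x] for x in domains):
        return False
    return all((assignment[x], assignment[y]) in rel
               for (x, y), rel in relations.items())

# Two boolean variables constrained by disequality:
doms, rels = make_instance(
    {"x": {0, 1}, "y": {0, 1}},
    {("x", "y"): {(0, 1), (1, 0)}})
```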

We say that v ∈ D(x_i) has a support at variable x_j if ∃w ∈ D(x_j) such that (v, w) ∈ R_ij. A binary CSP instance is arc consistent (AC) if for all pairs of distinct variables x_i, x_j, each v ∈ D(x_i) has a support at x_j [lecoutre].

In the following we assume that we have a binary CSP instance over n variables and, for clarity of presentation, we write R_xy as a shorthand for the constraint relation between variables x and y. For values a, b ∈ D(x), we use the notation a ≥_y b for

∀c ∈ D(y): (b, c) ∈ R_xy ⇒ (a, c) ∈ R_xy

(i.e. a can be substituted for b in any tuple (b, c) ∈ R_xy).

Definition 1

[DBLP:conf/aaai/Freuder91]   Given two values a, b ∈ D(x), b is neighbourhood substitutable (NS) by a if ∀y ≠ x: a ≥_y b.
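Definition 1 can be checked directly: b is neighbourhood substitutable by a exactly when every support of b at every other variable is also a support of a. A sketch under an assumed encoding in which each relation is a set of allowed value pairs stored in both directions:

```python
def neighbourhood_substitutable(domains, relations, x, b, a):
    """True if a can replace b in D(x): for every other variable y and
    every c in D(y), (b, c) allowed implies (a, c) allowed."""
    for (u, y), rel in relations.items():
        if u != x:
            continue
        for c in domains[y]:
            if (b, c) in rel and (a, c) not in rel:
                return False
    return True

# x in {0, 1, 2}, y in {0, 1}; allowed pairs for (x, y):
doms = {"x": {0, 1, 2}, "y": {0, 1}}
rxy = {(0, 0), (0, 1), (1, 0), (2, 1)}
rels = {("x", "y"): rxy, ("y", "x"): {(c, v) for (v, c) in rxy}}
# 1 is substitutable by 0 (0 covers 1's only support) but not vice versa
```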

Figure 2: An illustration of the definition of snake substitutability.

It is well known and indeed fairly obvious that eliminating a neighbourhood substitutable value does not change the satisfiability of a binary CSP instance. We will now define stronger notions of substitutability. The proofs that these are indeed valid value-elimination rules are not directly obvious and hence are delayed until Section 3. We use the notation for

This is illustrated in Figure 2, in which ovals represent domains, bullets represent values, a line joining two values means that these two values are compatible (so, for example, ), and the means that . Since in this definition is a function of and , if necessary, we will write instead of . In other words, the notation means that can be substituted for in any tuple provided we also replace by . It is clear that implies since it suffices to set since, trivially, for all . In Figure 1(a), the value is snake substitutable by : we have by taking (where the arguments of are as shown in Figure 2), since and ; and since . Indeed, by a similar argument, the value is snake substitutable by in each domain.

Definition 2

Given two values , is snake substitutable (SS) by if , .
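The mathematical symbols of Definition 2 were lost in extraction, so the following brute-force check is our reconstruction of its apparent meaning: b ∈ D(x) is snake substitutable by a if, for every other variable y, every support c of b has a replacement c' that supports a and is at least as compatible as c with every third variable (the "snake" step of Figure 2). Treat this as an illustrative sketch, not the paper's exact rule.

```python
def dominates(domains, relations, y, c, cp, skip):
    """cp can replace c in D(y) with respect to every variable not in skip."""
    for z in domains:
        if z in skip:
            continue
        ryz = relations.get((y, z))
        if ryz is None:            # trivial constraint: nothing to check
            continue
        for d in domains[z]:
            if (c, d) in ryz and (cp, d) not in ryz:
                return False
    return True

def snake_substitutable(domains, relations, x, b, a):
    """Our reconstruction of Definition 2 (see lead-in)."""
    for y in domains:
        if y == x:
            continue
        rxy = relations.get((x, y))
        if rxy is None:
            continue
        for c in domains[y]:
            if (b, c) not in rxy:
                continue           # c is not a support of b: vacuous
            if not any((a, cp) in rxy
                       and dominates(domains, relations, y, c, cp, {x, y})
                       for cp in domains[y]):
                return False
    return True

# With two variables related by equality, 0 is snake substitutable by 1
# (replace x=0, y=0 with x=1, y=1) although it is not NS by 1.
eq = {(0, 0), (1, 1)}
doms2 = {"x": {0, 1}, "y": {0, 1}}
rels2 = {("x", "y"): eq, ("y", "x"): eq}
# Chaining a third variable to y by equality blocks the substitution,
# since no replacement for y's value stays compatible with z:
doms3 = {"x": {0, 1}, "y": {0, 1}, "z": {0, 1}}
rels3 = {("x", "y"): eq, ("y", "x"): eq, ("y", "z"): eq, ("z", "y"): eq}
```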

Figure 3: An illustration of the definition of conditioned neighbourhood-substitutability of by (conditioned by ).

In the following two definitions, b can be eliminated from D(x) because it can be substituted by some other value a ∈ D(x), but this value a is a function of the value assigned to another variable y. Definition 3 is illustrated in Figure 3.

Definition 3

Given , is conditioned neighbourhood-substitutable (CNS) if for some , with , such that .

A CNS value is substitutable by a value where is a function of the value assigned to some other variable . In Figure 1(b), the value is conditioned neighbourhood-substitutable (CNS) with as the conditioning variable (i.e. in Definition 3): for the assignments of or to , we can take since , and for the assignment to , we can take since . By a symmetrical argument, the value is CNS, again with as the conditioning variable. We can note that in the resulting CSP instance, after eliminating from and from , all domains can be reduced to singletons by applying snake substitutability.

Observe that CNS subsumes arc consistency: if a value has no support at some variable y, then it is trivially CNS (conditioned by the variable y). It is easy to see from their definitions that SS and CNS both subsume NS (in instances with more than one variable), but that neither NS nor SS subsumes arc consistency.
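Again the symbols of Definition 3 were lost, so the following check encodes our reading of CNS: some conditioning variable y lets us pick, for each value c of y compatible with b, a substitute a(c) ≠ b that is compatible with c and covers all of b's supports at every variable other than y. This is a hedged reconstruction, not the paper's exact formulation.

```python
def covers(domains, relations, x, b, a, skip):
    """a covers b's supports at every variable of x outside skip."""
    for z in domains:
        if z == x or z in skip:
            continue
        rxz = relations.get((x, z))
        if rxz is None:            # trivial constraint: nothing to check
            continue
        for d in domains[z]:
            if (b, d) in rxz and (a, d) not in rxz:
                return False
    return True

def cns(domains, relations, x, b):
    """Our reading of Definition 3 (see lead-in)."""
    for y in domains:
        if y == x:
            continue
        rxy = relations.get((x, y))
        if rxy is None:
            continue
        if all(any((a, c) in rxy and covers(domains, relations, x, b, a, {y})
                   for a in domains[x] - {b})
               for c in domains[y] if (b, c) in rxy):
            return True
    return False

# b = 0 has no single substitute (1 misses y=1 and 2 misses y=0) but,
# conditioned on the value of y, one of them always works, so 0 is CNS:
doms = {"x": {0, 1, 2}, "y": {0, 1}, "z": {0}}
rxy = {(0, 0), (0, 1), (1, 0), (2, 1)}
rxz = {(0, 0), (1, 0), (2, 0)}
rels = {("x", "y"): rxy, ("y", "x"): {(c, v) for (v, c) in rxy},
        ("x", "z"): rxz, ("z", "x"): {(d, v) for (v, d) in rxz}}
```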

We now integrate the notion of snake substitutability in two ways in the definition of CNS: the value (see Figure 3) assigned to a variable may be replaced by a value (as in the definition of , above), but the value (see Figure 3) assigned to the conditioning variable may also be replaced by a value . This is illustrated in Figure 4.

Figure 4: An illustration of snake-conditioned snake-substitutability of by .
Definition 4

A value is snake-conditioned snake-substitutable (SCSS) if for some , with , such that .

In Figure 1(c), the value is snake-conditioned snake-substitutable (SCSS) with as the conditioning variable: for the assignment of or to , we can take since (taking for ) and (taking for ), and for the assignment of to , we can take since (again taking for ) and (again taking for ). By similar arguments, all domains can be reduced to singletons following the SCSS elimination of values in the following order: from , , and from , , and from , , and from and from .

We can see that SCSS subsumes CNS by setting in Definition 4 and by recalling that implies that . It is a bit more subtle to see that SCSS subsumes SS: if is snake substitutable by some value , it suffices to choose in Definition 4 to be this value (which is thus constant, i.e. not dependent on the value of ), then the snake substitutability of by implies that for all and , which in turn implies that for ; thus is snake-conditioned snake-substitutable.

3 Value elimination

It is well-known that NS is a valid value-elimination property, in the sense that if is neighbourhood substitutable by then can be eliminated from without changing the satisfiability of the CSP instance [DBLP:conf/aaai/Freuder91]. In this section we show that SCSS is a valid value-elimination property. Since SS and CNS are subsumed by SCSS, it follows immediately that SS and CNS are also valid value-elimination properties.

Theorem 3.1

In a binary CSP instance , if is snake-conditioned snake-substitutable then can be eliminated from without changing the satisfiability of the instance.

Proof

By Definition 4, for some , with , such that

(1)
(2)

We will only apply this definition for fixed , and for fixed values and , so we can consider as a constant (even though it is actually a function of ). Let be a solution to with . It suffices to show that there is another solution with . Consider . Since is a solution, we know that . Thus, according to the above definition of SCSS, there is a value that can replace (conditioned by the assignment ) in the sense that (1) and (2) are satisfied. Now, for each , , i.e.

Recall that is a function of and . But we will only consider fixed and a unique value of dependent on , so we will write for brevity. Indeed, setting we can deduce from (since is a solution) that

(3)

Define the -tuple as follows:

Clearly and for all . To prove that is a solution, it remains to show that all binary constraints are satisfied, i.e. that for all distinct . There are three cases: (1) , , (2) , , (3) .

  • There are three subcases: (a) and , (b) and , (c) and . In case (a), and , so from equation 2, we have . In case (b), and and so, trivially, . In case (c), and , so from equation 3, we have .

  • There are four subcases: (a) and , (b) and , (c) and , (d) and . In case (a), and , so since is a solution. In case (b), and ; setting , in equation 3, we have since . In case (c), and ; setting and in equation 2 we can deduce that since . In case (d), and . By the same argument as in case 2(b), we know that , and then setting and in equation 2, we can deduce that .

  • There are three essentially distinct subcases: (a) and , (b) and , (c) and . In cases (a) and (b) we can deduce by the same arguments as in cases 2(a) and 2(b), above. In case (c), and . Setting in equation 3, we have from which we can deduce that since . Reversing the roles of and in equation 3 (which is possible since they are distinct and both different to and ), we also have that . We can then deduce that since we have just shown that .

We have thus shown that any solution with can be transformed into another solution that does not assign the value to and hence that the elimination of from preserves satisfiability.

Corollary 1

In a binary CSP instance , if is snake-substitutable or conditioned neighbourhood substitutable, then can be eliminated from without changing the satisfiability of the instance.

4 Examples

We have already illustrated the potential of SS, CNS and SCSS on the small instances of Figure 1 alongside their definitions in Section 2; we now turn to a larger application.


Figure 5: The six different types of trihedral vertices: , , , , , .
Figure 6: The catalogue of labelled junctions that are projections of trihedral vertices.
Figure 7: An example from a family of line drawings whose exponential number of labellings is reduced to one by snake substitution.

To give a non-numerical example, we considered the impact of SS and CNS in the classic problem of labelling line-drawings of polyhedral scenes composed of objects with trihedral vertices [DBLP:journals/ai/Clowes71, Huff1, Waltz]. There are six types of trihedral vertices, shown in Figure 5. The aim is to assign each line in the drawing a semantic label among four possibilities: convex (+), concave (-) or occluding (one of two labels depending on whether the occluding surface is above or below the line). Some lines in the top middle drawing in Figure 5 have been labelled to illustrate the meaning of these labels. This problem can be expressed as a binary CSP by treating the junctions as variables. The domains of variables are given by the catalogue of physically realisable labellings of the corresponding junction according to its type. This catalogue of junction labellings is obtained by considering the six vertex types viewed from all possible viewpoints [DBLP:journals/ai/Clowes71, Huff1]. For example, there are 6 possible labellings of an L-junction, 8 for a T-junction, 5 for a Y-junction and 3 for a W-junction [ldbook]. The complete catalogue of labelled junctions is shown in Figure 6, where a question mark represents any of the four labels and rotationally symmetric labellings are omitted. There is a constraint between any two junctions joined by a line: this line must have the same semantic label at both ends. We can also apply binary constraints between distant junctions: the 2Reg constraint limits the possible labellings of junctions such as those in Figure 7, since two non-collinear lines which separate the same two regions cannot both be concave [lddl, ldbook].
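The encoding just described can be sketched in a few lines. The junction domains below are invented for illustration (they are much smaller than, and not taken from, the real catalogue of Figure 6); what matters is the structure: one variable per junction, one tuple of line labels per domain value, and equality of the shared line's label between neighbouring junctions.

```python
# Hypothetical two-junction fragment: J1 meets lines (e1, e2) and J2
# meets lines (e2, e3); line e2 is shared, so its label must agree.
domains = {
    "J1": {("+", "+"), ("-", "occ")},   # made-up labellings of J1
    "J2": {("+", "-"), ("occ", "-")},   # made-up labellings of J2
}

def same_shared_label(t1, i, t2, j):
    """The line shared by two junctions carries a single label."""
    return t1[i] == t2[j]

# Allowed pairs for the constraint between J1 and J2 (e2 is J1's
# second line and J2's first line):
allowed = {(t1, t2)
           for t1 in domains["J1"] for t2 in domains["J2"]
           if same_shared_label(t1, 1, t2, 0)}
```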

Figure 8: An example from a family of line drawings whose exponential number of labellings is reduced to one by snake substitution.

The drawings shown in Figure 7 and Figure 8 are ambiguous. For example, in Figure 7, any of lines , or could be projections of concave edges (meaning that the two blocks on the left side of the figure are part of the same object) or all three could be projections of occluding edges (meaning that these two blocks are, in fact, separate objects). Similarly, lines and in Figure 8 could be projections of occluding or concave edges. The drawings shown in Figure 7 and Figure 8 are both examples of families of line drawings. In each of these figures there are four copies of the basic structure, but there is a clear generalisation to drawings containing copies of the basic structure. The ambiguity that we have pointed out above gives rise to an exponential number of valid labellings for these families of drawings. However, after applying arc consistency and snake substitution until convergence, each domain is a singleton for both these families of line drawings. We illustrate this by giving one example of a snake substitution. After arc consistency has been established, the labelling for junction in Figure 7 is snake substitutable by . This can be seen by consulting Figure 9 which shows the domains of variables , , , and with lines joining compatible labellings for adjacent junctions: snake substitutability follows from the fact that the labelling for can be replaced by in any global labelling, provided the labelling for is also replaced by and the labelling for is also replaced by . Purely for clarity of presentation, some constraints have not been shown in Figure 9, notably the 2Reg constraints between distant junctions [lddl, ldbook].

Of course, there are line drawings where snake substitution is much less effective than in Figures 7 and 8. Nevertheless, in the six drawings in Figure 5, which are a representative sample of simple line drawings, 22 of the 73 junctions have their domains reduced to singletons by arc consistency alone and a further 20 junctions have their domains reduced to singletons when both arc consistency and snake substitution are applied. This can be compared with neighbourhood substitution which eliminates no domain values in this sample of six drawings. It should be mentioned that we found no examples where conditioned neighbourhood substitution led to the elimination of labellings in the line-drawing labelling problem.

Figure 9: A close-up view of the variables corresponding to junctions , , , and in Figure 7.

5 Complexity

In a binary CSP instance, we say that two variables constrain each other if there is a non-trivial constraint between them. Let E denote the set of pairs of variables that constrain each other. We use d to denote the maximum size of the domains and e to denote the number of non-trivial binary constraints. In this section we show that it is possible to apply CNS and SS until convergence, and to check SCSS, in the same asymptotic time as applying neighbourhood substitution (NS) until convergence [ns]. This is interesting because (in instances with more than one variable) CNS, SS and SCSS all strictly subsume NS.

5.1 Substitution and arc consistency

It is well known that arc consistency eliminations can provoke new eliminations by neighbourhood substitution (NS) but that NS eliminations cannot destroy arc consistency [ns]. It follows that arc consistency eliminations can provoke new eliminations by SS, CNS and SCSS (since these notions subsume NS). It is easily seen from Definition 3 that eliminations by CNS cannot destroy arc consistency. We therefore assume in this section that arc consistency has been established before looking for eliminations by any form of substitution. Nonetheless, unlike CNS, eliminations by SS (or SCSS) can provoke new eliminations by arc consistency; however, these eliminations cannot themselves propagate. To see this, suppose that is eliminated since it is snake-substitutable by . If is the only support of at , then can then be eliminated by arc consistency. However, the elimination of cannot provoke any new eliminations by arc consistency. To see this, recall that, by Definition 2 of SS, there is a value such that for all , for all , if was a support for at then so was (as illustrated in Figure 2). Furthermore, since was the only support for at , no other value in can lose its support when is eliminated from . In conclusion, the algorithm for applying SS has to apply this limited form of arc-consistency (without propagation) whereas the algorithm to apply CNS does not need to test for arc consistency since we assume that it has already been established. Furthermore, since AC is, in fact, subsumed by SCSS we do not explicitly need to test for it in the algorithm to apply SCSS.

5.2 Applying SS until convergence

We first give an algorithm for applying SS until convergence. The following notions used by the algorithm are best understood by consulting Figure 2. An assignment that contradicts some is known as a block, and the associated variable is known as a block variable. For , the term sub denotes a value that could replace when is assigned (in the sense that and , as illustrated in Figure 2). In the context of a possible snake substitution of by , a value that does not have a sub that could replace it is a stop and the corresponding variable is a stop variable. If there are no stop variables, then is snake substitutable by . The algorithm for applying SS until convergence uses the following data structures:

  • For all , for all ,
    NbBlocks() .

  • For all , for all ,
    BlockVars() .
If BlockVars() = ∅, for some substituting value, then the substituted value can be eliminated from its domain by neighbourhood substitution.

  • For all , for all , for all such that ,
    NbSubs() .
    Note that if and only if .

  • For all , for all , NbStops()
    .

  • For all , for all ,
    NbStopVars() .

  • For all , for all ,
    NbSnake() .
    If NbSnake() , then value can be eliminated from by snake substitution.

  • For all , for all ,
    Inconsistent() true if has no support at some other variable.

We assume, throughout this section, that each set (such as BlockVars()) which is a subset of a fixed set (in this case the set of variables) is stored using an array data structure whose index ranges over the elements of this fixed set, thus allowing all basic operations, such as insertion and deletion, to be performed in O(1) time. The data structures listed above can clearly be initialised in polynomial time and space. We also use a list data structure ElimList to which we add pairs if NbSnake() (indicating that the corresponding value could be eliminated by snake substitution). Note that, due to possible eliminations of other values, from the same or other domains, between the detection of snake substitutability and the moment this information is processed, the value may no longer be snake substitutable when ElimList is processed. We choose to give precedence to neighbourhood substitutability over snake substitutability by placing neighbourhood-substitutable values at the head of the list ElimList (line (1) in the code below) and snake-substitutable values at the tail. We also add to ElimList values that can be eliminated by arc consistency.

The processing of elements of ElimList involves the updating of all the above data structures which can in turn lead to the detection of new eliminations. The algorithm given in Figure 10 performs eliminations and propagates until convergence, assuming that the above data structures (including ElimList) have all been initialised. When a value is eliminated from we have to update NbBlocks because may correspond to value in Figure 2, NbSubs because may correspond to value in Figure 2, NbStops because may correspond to the value in Figure 2, NbSnake because may correspond to the value in Figure 2, and Inconsistent when corresponds to value in Figure 2.
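The arities of these counters were lost in extraction; as an illustrative sketch (our own guess at the intended bookkeeping), here is an initialisation of just the NS-related part: NbBlocks counts, for each candidate substitution of b by a in D(x), the values of a third variable y that support b but not a, and BlockVars collects the variables where such blocks remain. An empty BlockVars entry signals a neighbourhood substitution.

```python
from collections import defaultdict

def init_ns_counters(domains, relations):
    """NbBlocks[x, a, b, y]: number of c in D(y) compatible with b but
    not with a.  BlockVars[x, a, b]: the set of variables y where at
    least one such block remains; when it is empty, b is neighbourhood
    substitutable by a and can be queued for elimination."""
    nb_blocks = defaultdict(int)
    block_vars = defaultdict(set)
    for (x, y), rel in relations.items():
        for a in domains[x]:
            for b in domains[x]:
                if a == b:
                    continue
                n = sum(1 for c in domains[y]
                        if (b, c) in rel and (a, c) not in rel)
                nb_blocks[x, a, b, y] = n
                if n > 0:
                    block_vars[x, a, b].add(y)
    return nb_blocks, block_vars

doms = {"x": {0, 1}, "y": {0, 1}}
rxy = {(0, 0), (0, 1), (1, 0)}
rels = {("x", "y"): rxy, ("y", "x"): {(c, v) for (v, c) in rxy}}
nb, bv = init_ns_counters(doms, rels)
```

When a value is eliminated, the propagation loop of Figure 10 decrements the affected counters instead of recomputing them from scratch, which is what keeps the overall cost low.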

while ElimList :
    pop from ElimList ;
    if and (NbSnake() or Inconsistent()) :
        delete from ;
        *** Update NbBlocks and propagate ***
        for all such that :
            for all such that and :
                NbBlocks() := NbBlocks() ;
                if NbBlocks() then
                    delete from BlockVars() ;
                    if BlockVars() becomes then    *** NS ***
                        add to head of ElimList ; …(1)
                    if BlockVars() becomes a singleton then
                        for all such that and :
                            NbSubs() := NbSubs() ; …(2)
                            if NbSubs() then DecStops() ;
        for all such that :
            *** Update NbSubs and propagate ***
            for all such that :
                if and BlockVars() then …(3)
                    NbSubs() := NbSubs() ;
                    if NbSubs() then
                        for all such that : IncStops() ;
            *** Update NbStops and propagate ***
            for all such that and and NbSubs() :
                DecStops() ; …(4)
        *** Update Inconsistent ***
        for all :
            if has no support at then    *** not AC ***
                Inconsistent() := true ;
                add to head of ElimList ;
        *** Update NbSnake ***
        for all such that NbStopVars() :
            NbSnake() := NbSnake() ;

Figure 10: The propagation algorithm for applying SS until convergence.

The subprograms DecStops and IncStops for updating NbStops (and the consequent updating of NbStopVars and NbSnake) are given in Figure 11.

procedure DecStops() : NbStops() := NbStops()