Improving MUC extraction thanks to local search

07/12/2013 · Éric Grégoire et al.

Extracting MUCs (Minimal Unsatisfiable Cores) from an unsatisfiable constraint network is a useful process when the causes of unsatisfiability must be understood so that the network can be re-engineered and relaxed to become satisfiable. Despite bad worst-case computational complexity results, various MUC-finding approaches that appear tractable for many real-life instances have been proposed. Many of them are based on the successive identification of so-called transition constraints. In this respect, we show how local search can be used to possibly extract additional transition constraints at each main iteration step. The approach is shown to outperform a technique based on a form of model rotation imported from the SAT-related technology that also exhibits additional transition constraints. Our extensive experiments show that this enhancement also boosts the performance of state-of-the-art dc(wcore)-like MUC extractors.




1 Introduction

In this paper, the focus is on unsatisfiable constraint networks. More precisely, a new approach for extracting minimal cores (or MUCs, for Minimal Unsatisfiable Cores) of constraint networks is proposed. A MUC is a minimal (w.r.t. set inclusion) set of constraints that cannot all be satisfied together. When the causes of unsatisfiability must be understood and the network must be re-engineered and relaxed to become satisfiable, extracting MUCs can be a cornerstone issue since a MUC provides one explanation for unsatisfiability in terms of a minimal set of incompatible constraints. Despite bad worst-case computational complexity results, various approaches for extracting one MUC have been proposed that appear tractable for many instances [8, 2, 21, 20, 18, 19, 17, 15, 14].

A MUC of a network can also be defined as an unsatisfiable sub-network formed of transition constraints, which are constraints that would allow this sub-network to become satisfiable if any of them were removed. Powerful approaches to MUC extraction are founded on transition constraints, both in the csp [17, 14] and the sat [23, 12, 13, 33, 28, 4] domains. In this last area, a recent approach [32, 5] builds on the following intuition. An assignment of values to the variables that satisfies all constraints except one is called a transition assignment and the unsatisfied constraint is a transition constraint: additional transition constraints might be discovered by so-called model rotation, i.e., by examining other assignments differing from the transition assignment on the value of one variable only.

In this paper, an approach both extending and enhancing this latter technique is proposed in the constraint network framework. The main idea is to use local search for exploring the neighborhood of transition assignments in an attempt to find other transition constraints. The technique is put to work in a so-called dichotomy destructive strategy à la dc(wcore) [17] to extract one MUC. Extensive experiments show that this approach outperforms both the model rotation technique from [5] and state-of-the-art MUC extractors.

The paper is organized as follows. In the next section, basic concepts, definitions and notations are provided. Section 3 focuses on existing techniques for MUC extraction, including dc(wcore)-like ones, and then on model rotation. In Section 4, a local search procedure for exhibiting additional transition constraints is presented and motivated, whereas Section 5 describes the full algorithm for MUC extraction. Section 6 describes our experimental investigations and results before some promising paths for further research are presented in the conclusion.

2 Definitions and Notations

Constraint networks are defined as follows.

Definition 1 (Constraint network)

A constraint network (cn) is a pair P = (X, C), where

  1. X is a finite set of variables s.t. each variable x has an associated finite instantiation domain, denoted dom(x);

  2. C is a finite set of constraints s.t. each constraint c involves a subset of variables of X, called its scope and denoted scope(c). c translates an associated relation that contains all the tuples of values for the variables of scope(c) that satisfy c.

A constraint network where the scope of all constraints is binary can be represented by a non-oriented graph where each variable is a node and each constraint is an edge.

(a) constraint network of Example 1.

(b) one MUC in Example 1.
Figure 1: Graphical representation of Example 1.
Example 1

Let X be a set of variables where each variable has the same domain, and let C be a set of constraints over X. The constraint network P = (X, C) can be represented by the graph of Fig. 1(a).

Definition 2 (Assignment and solution)

An assignment of a constraint network P = (X, C) is an assignment of values to all variables of X. A solution to P is any assignment that satisfies all constraints of C.

A form of Constraint Satisfaction Problem (csp) consists in checking whether a constraint network P admits at least one solution. This decision problem is NP-complete. If P admits at least one solution then P is called satisfiable, else P is called unsatisfiable. The constraint network of Example 1 is unsatisfiable. When a constraint network is unsatisfiable, it admits at least one Minimal (w.r.t. set inclusion) Unsatisfiable Core (in short, MUC).
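For illustration, the definitions above can be made concrete with a brute-force satisfiability check. The following sketch uses an encoding of our own (domains as a dict, constraints as scope/predicate pairs), not the paper's implementation:

```python
from itertools import product

# Toy encoding (ours, not the paper's): domains maps each variable to its
# finite instantiation domain; constraints is a list of (scope, predicate)
# pairs, the predicate returning True iff the scope values satisfy it.

def solutions(domains, constraints):
    """Enumerate every assignment that satisfies all constraints (brute force)."""
    variables = list(domains)
    for values in product(*(domains[v] for v in variables)):
        assignment = dict(zip(variables, values))
        if all(pred(*(assignment[v] for v in scope))
               for scope, pred in constraints):
            yield assignment

def satisfiable(domains, constraints):
    return next(solutions(domains, constraints), None) is not None

# A 3-clique of disequalities over two values: pairwise satisfiable,
# globally unsatisfiable (no proper 2-coloring of a triangle).
doms = {"x": [0, 1], "y": [0, 1], "z": [0, 1]}
cons = [(("x", "y"), lambda a, b: a != b),
        (("y", "z"), lambda a, b: a != b),
        (("x", "z"), lambda a, b: a != b)]
```

Enumeration is exponential in the number of variables, which is exactly why the paper relies on a filtering solver (MAC) instead; the sketch only grounds the definitions.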

Definition 3 (Core and MUC)

Let P = (X, C) be a constraint network.
P' = (X', C') is an unsatisfiable core, in short a core, of P iff

  • P' is an unsatisfiable constraint network, and

  • X' ⊆ X and C' ⊆ C.

P' is a Minimal Unsatisfiable Core (MUC) of P iff

  • P' is a core of P, and

  • there does not exist any proper core of P': every strict sub-network of P' is satisfiable.

Example 1. (cont'd) P is unsatisfiable and admits only one MUC, illustrated in Fig. 1(b).
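On small networks, Definition 3 can be checked directly: a core is a MUC iff it is unsatisfiable and dropping any single constraint restores satisfiability. A brute-force sketch in our own toy encoding (not the paper's code):

```python
from itertools import product

def satisfiable(domains, constraints):
    """Naive satisfiability test over the full assignment space."""
    vs = list(domains)
    return any(all(p(*(dict(zip(vs, vals))[v] for v in s)) for s, p in constraints)
               for vals in product(*(domains[v] for v in vs)))

def is_muc(domains, core):
    """Definition 3: the core is unsatisfiable, and every sub-network obtained
    by removing one constraint is satisfiable (minimality w.r.t. inclusion)."""
    return (not satisfiable(domains, core)
            and all(satisfiable(domains, core[:i] + core[i + 1:])
                    for i in range(len(core))))

doms = {"x": [0, 1], "y": [0, 1], "z": [0, 1]}
neq = lambda a, b: a != b
triangle = [(("x", "y"), neq), (("y", "z"), neq), (("x", "z"), neq)]
```

Adding a redundant constraint to the triangle keeps it unsatisfiable but breaks minimality, so `is_muc` rejects the enlarged core.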

Whenever P is unsatisfiable, P exhibits at least one MUC. In the worst case, the number of different MUCs can be exponential in the number n of constraints of P (actually, it is in O(C(n, n/2))). Note that different MUCs of a same network can share constraints. Accordingly, not all MUCs need to be extracted in a step-by-step relaxation process to make the network become satisfiable. Especially, in such an iterative process, where at each step one MUC is extracted and relaxed so that it becomes satisfiable, at most n MUCs need to be extracted.

Example 2

Fig. 2 depicts an unsatisfiable constraint network with three MUCs. In this example, each variable is given the same domain.

Figure 2: An unsatisfiable constraint network that contains 3 MUCs.

Extracting one MUC from an unsatisfiable constraint network is a heavy computational task in the worst case. Indeed, checking whether a constraint network is a MUC is DP-complete [27]. Despite this bad worst-case computational complexity property, various families of approaches that run in acceptable time for many instances have been proposed. We describe representatives of some of the main ones in the next section.

3 MUC Extraction

Most recent approaches for extracting one MUC from an unsatisfiable constraint network P start by computing one core (which need not be minimal) of P. This step can be seen as optional since an unsatisfiable constraint network is already a core of itself. During this step, however, some information can be collected that will help guide the subsequent minimization step. In this paper, we focus on this minimization step and make use of the wcore core extractor introduced in [17] as a preprocessing step, which we briefly describe hereafter.

3.1 wcore as a pre-processing step

When the unsatisfiability of a constraint network P is proved thanks to a filtering search algorithm, wcore [17] delivers a core of P that is formed of all the constraints that have been involved in the proof of unsatisfiability, namely all the constraints that have been used during the search to remove by propagation at least one value from the domain of some variable. Such constraints are called active. Therefore, when P is shown unsatisfiable, the active constraints form a core since the other constraints do not actually take part in this proof of inconsistency. The approach from [17] iterates this process until no smaller set of active constraints is found. Consequently, at the end of this first step, constraints that are not active can be removed safely while the remaining constraint network stays unsatisfiable.

Clearly, the resulting core can depend on the order according to which the partial assignments are investigated, which is guided by some branching heuristic.

wcore takes advantage of the powerful dom/wdeg heuristic [16] (see also variants in e.g. [14]), which consists in attaching to each constraint a counter, initialized to 1, that is incremented each time the corresponding constraint is involved in a conflict. In this respect, dom/wdeg selects the variable with the smallest ratio between the current domain size and a weighted degree, which is defined as the sum of the counters of the constraints in which the variable is involved. This heuristic allows the search to focus on constraints that appear difficult to satisfy. The goal is not only to attempt to ease the search for inconsistency but also to record some indication that these constraints are probably prone to belong to a MUC. Accordingly, it is proposed in [17] to weight the constraints via the dom/wdeg heuristic and use the wcore approach as a preprocessing step for MUC extraction in an attempt to reduce the size of the core. Likewise, our approach reuses this weighting information in the subsequent steps of the algorithm to compute one MUC.
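The selection rule just described can be sketched as follows. This is a simplified, static version in our own encoding: real implementations update the counters during search and work on the current, filtered domains.

```python
def dom_wdeg_pick(domains, constraints, weights, unassigned):
    """Pick the unassigned variable minimizing |dom(v)| / wdeg(v), where wdeg(v)
    sums the conflict counters of the constraints whose scope contains v."""
    def wdeg(v):
        return sum(w for (scope, _), w in zip(constraints, weights) if v in scope)
    # max(..., 1) guards against a variable involved in no weighted constraint
    return min(unassigned, key=lambda v: len(domains[v]) / max(wdeg(v), 1))

doms = {"x": [0, 1, 2], "y": [0, 1]}
cons = [(("x", "y"), lambda a, b: a != b), (("x",), lambda a: a > 0)]
```

With counters [2, 3], wdeg(x) = 5 and wdeg(y) = 2, so the ratios are 3/5 and 2/2 and x is selected despite its larger domain, illustrating how conflict weights override pure domain size.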

3.2 Minimization step

Once a core has been extracted from a constraint network, it must be minimized so that it forms one MUC. To this end, it might be necessary to check whether a constraint belongs or not to the set of MUCs included within a core, which is a task in Σ2p [9]. In practice, this step is often based on the identification of forms of transition constraints.

Definition 4 (Transition constraint)

Let P = (X, C) be an unsatisfiable constraint network. c ∈ C is a transition constraint of P iff there exists an assignment A of X such that A is a model of (X, C \ {c}).

Example 1 (cont'd) Consider again Fig. 1(a). The network contains a transition constraint c: indeed P is unsatisfiable and some assignment is a solution of (X, C \ {c}).

The following property is straightforward and directly follows from the definition of transition constraints.

Property 1

If c is a transition constraint of a core P then c belongs to every MUC of P.

Clearly, all the MUCs of a core do not necessarily share a non-empty intersection and a constraint network might thus have no transition constraint. Actually, the process of finding transition constraints is performed with respect to some subparts of the network, adopting e.g. either so-called destructive or constructive approaches [8, 2, 20, 18, 17, 14]. For example, constructive approaches (as in [8]) successively insert constraints taken from the core into a set of constraints until this latter set becomes unsatisfiable. Conversely, destructive approaches [2] successively remove constraints from the initial core until the current network becomes satisfiable. Constraints are ordered and each time a transition constraint is discovered, it is placed at the beginning of the core according to this order. All constraints are tested according to the inverse order. It is also possible to use a dichotomy strategy in order to find transition constraints [17]. Variants and combinations of these techniques can be traced back to e.g. QuickXplain [19, 24] and the combined approach [14].
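The destructive scheme can be sketched as a simple deletion filter (linear rather than dichotomic; brute-force satisfiability stands in for a real solver, and the encoding is our own illustration):

```python
from itertools import product

def satisfiable(domains, constraints):
    """Naive satisfiability test standing in for a MAC solver."""
    vs = list(domains)
    return any(all(p(*(dict(zip(vs, vals))[v] for v in s)) for s, p in constraints)
               for vals in product(*(domains[v] for v in vs)))

def deletion_filter_muc(domains, core):
    """Test constraints one by one: if the network stays unsatisfiable without
    a constraint, drop it for good; otherwise it is a transition constraint
    of the current network and must be kept. The surviving set is a MUC."""
    kept = list(core)
    i = 0
    while i < len(kept):
        rest = kept[:i] + kept[i + 1:]
        if satisfiable(domains, rest):
            i += 1        # transition constraint: keep it
        else:
            kept = rest   # not needed for unsatisfiability: remove it
    return kept

doms = {"x": [0, 1], "y": [0, 1], "z": [0, 1]}
neq = lambda a, b: a != b
core = [(("x",), lambda a: a in (0, 1)),       # redundant constraint
        (("x", "y"), neq), (("y", "z"), neq), (("x", "z"), neq)]
```

On this toy core the redundant unary constraint is discarded and the three disequalities survive: each of them is a transition constraint of the final set, so the result is minimal.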

Clearly, the order according to which the constraints are tested is critical for the efficiency of each approach. This order can be set according to the weights of constraints computed during the wcore step. In the rest of the paper, we focus on a dichotomy destructive approach, which will be presented in more detail later on. Before that, let us briefly present a method that has been recently proposed in the sat research community to find more than one transition constraint at each main iteration of a MUC extraction algorithm (in the sat domain, a MUC is called a MUS, for Minimal Unsatisfiable Subformula).

3.3 Recursive Model Rotation

The model rotation approach (mr) has been introduced in [31]. It is based on the transition assignment concept.

Definition 5 (Transition assignment)

Let P = (X, C) be a core. An assignment A of X is a transition assignment of P iff A falsifies only one constraint in C.

Property 2

Let P = (X, C) be a core and c ∈ C. c is a transition constraint of P iff there exists a transition assignment of P that falsifies c.

The proof is straightforward since c is a transition constraint of P iff P is unsatisfiable and there exists a solution A of (X, C \ {c}), i.e., iff A falsifies only the constraint c of P.

When a transition assignment A is found, the model rotation approach explores assignments that differ from A w.r.t. one value only. If such a close assignment also falsifies only one constraint c', then c' belongs to every MUC, too.

In [5], it is proposed to perform model rotation recursively. This extended technique is called Recursive Model Rotation; it is summarized in a csp version in Algorithm 1. This algorithm always makes local changes to the value of one variable in the transition assignment, trying to find another transition assignment that exhibits another constraint (lines 3–5), without any call to a constraint network solver. Contrary to the initial model rotation technique, the process is not stopped when a transition assignment does not deliver an additional transition constraint. Instead, model rotation is recursively performed with all transition assignments found (lines 6–8). See e.g. [4, 3, 30] for more on the use of model rotation to extract MUSes.

Input: P = (X, C): an unsatisfiable cn,
       M: a set of constraints belonging to every MUC of P,
       A: a transition assignment of P
Output: M, expanded
1  c ← the only constraint falsified by A            /* transition constraint */
2  foreach variable x ∈ scope(c) do
3      foreach value v ∈ dom(x) \ {A[x]} do
4          A' ← A where the variable x is assigned to v
5          if A' is a transition assignment of P then
6              let c' be the transition constraint associated to A' w.r.t. P
7              if c' ∉ M then
8                  M ← M ∪ {c'}; Recursive-mr(P, M, A')
9  return M

Algorithm 1: Recursive-mr (mr stands for Model Rotation)
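Algorithm 1 can be sketched in Python over our own toy encoding (scope/predicate constraint pairs; not the paper's data structures). As in the algorithm, no solver call is made, only one-variable flips:

```python
def falsified(assignment, constraints):
    """Constraints violated by the assignment."""
    return [c for c in constraints
            if not c[1](*(assignment[v] for v in c[0]))]

def recursive_mr(domains, core, assignment, found):
    """Flip one variable's value at a time; whenever the neighbour falsifies
    exactly one constraint not yet collected, record it as a transition
    constraint and recurse from that neighbouring transition assignment."""
    for v in assignment:
        for val in domains[v]:
            if val == assignment[v]:
                continue
            neighbour = dict(assignment, **{v: val})
            bad = falsified(neighbour, core)
            if len(bad) == 1 and bad[0] not in found:
                found.append(bad[0])
                recursive_mr(domains, core, neighbour, found)
    return found

doms = {"x": [0, 1], "y": [0, 1], "z": [0, 1]}
neq = lambda a, b: a != b
core = [(("x", "y"), neq), (("y", "z"), neq), (("x", "z"), neq)]
start = {"x": 0, "y": 1, "z": 1}          # falsifies only y != z
```

Starting from the single transition assignment above, the recursion discovers all three constraints of this MUC without ever invoking a solver, which is the whole point of model rotation.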





4 Local Search for Transition Constraints

In the following, we introduce a new approach for computing one MUC by means of exhibiting transition constraints, which relies on stochastic local search (in short, sls) as a kernel procedure. Whenever sls reaches an assignment that falsifies exactly one constraint c, c is a transition constraint and belongs to the final MUC. Obviously, such an assignment is a local minimum for sls, which can then explore neighborhood assignments, including other possible transition assignments that would be discovered by recursive model rotation. Hence, we have investigated a generic approach based on sls that we call Local Search for Transition Constraints, in short lstc.

Local search in the sat and csp domains is usually implemented as a tool intended to search for a model. On the contrary, we make use of sls to explore the neighborhood of a transition assignment in search for additional transition constraints. Accordingly, some adaptations were made to the usual sls scheme. Although its escape strategy is close to the so-called breakout method [26], the end criterion was modified in order to allow sls to focus and stress on parts of the search space that are expectedly very informative, as proposed in [11, 1]. More precisely, the counter nbIt of iterations remaining to be performed is increased in a significant way each time an additional transition constraint has been discovered, as sls might have reached a promising part of the search space that has not been explored so far. On the contrary, when an already discovered transition constraint c is found again, the counter is decreased by the number of times c has already been considered, as a way to guide sls outside expectedly well-explored parts of the search space. Otherwise, the counter is decremented at each step and the procedure ends when nbIt becomes negative.

The objective function was itself modified to enforce the satisfaction of the constraints already identified as belonging to the MUC that will be exhibited. More precisely, these latter constraints have their weights increased in order to be satisfied first.

Algorithm 2 summarizes the approach. It takes as input an unsatisfiable constraint network P, an assignment A and the current set M of constraints that have already been recognized as belonging to the MUC that will be exhibited. Note that in most calls to Algorithm 2, the parameter P is a subpart of the constraint network for which a MUC must be found; P will represent the current result of a dichotomy destructive strategy in the calling procedure.

Input: P = (X, C): a cn,
       M: a set of constraints belonging to every MUC of P,
       A: a transition assignment of P (possibly empty)
Output: M, expanded
1   if A = ∅ then A ← a random assignment of X
2   foreach c ∈ C do weight[c] ← 1
3   initialize nbIt with a preset positive number    /* counter nbIt of
    remaining iterations is initialized */
4   while nbIt ≥ 0 do
5       if a local minimum is reached then
6           if exactly one constraint is falsified by A then  /* transition assignment */
7               let c be the constraint falsified by A        /* transition constraint */
8               if c ∉ M then
9                   M ← M ∪ {c}
10                  increase nbIt by a preset positive bonus
11              else
12                  decrease nbIt by the number of times c has already been considered
13                  weight[c] ← weight[c] + 1
14          change the value in A of one variable of X according to an escape strategy
15      else change the value in A of one variable of X s.t. the sum of the
        weights of violated constraints decreases
16      nbIt ← nbIt − 1
17  return M

Algorithm 2: lstc (stands for Local Search for Transition Constraints)










Also, A does not need to be a transition assignment. When A is empty, it is randomly initialized, as in a classical sls procedure. Otherwise, A is a transition assignment which is used as the starting point of the search. The algorithm returns M after this set has possibly been extended by additional constraints also belonging to the MUC. The local search is a standard basic random-walk procedure [29] where the objective function has been modified in order to take a specific weight on each constraint into account. Note that these weights are specific to the call to lstc and thus different from the counters delivered by the pre-processing step, which are used by the dichotomy strategy.

A local minimum of an sls algorithm is a state where there does not exist any assignment that can be reached by a single move of the local search and that would decrease the sum of the weights of the falsified constraints. Each time a local minimum is reached, the method tries to collect information (lines 6–13). Thus, before applying an escape criterion (line 14), when there is only one constraint c of C that is falsified by the current assignment (i.e., when the current assignment is a transition assignment), c must appear in the final MUC. Two sub-cases are then as follows. When c does not already belong to M, c is inserted within M (line 9) and the value of nbIt is increased (line 10). When c already belongs to M, a penalty under the form of a negative number is applied to nbIt (line 12) and the weight of c is incremented (line 13). In this way, the more a transition constraint is considered, the greater is the penalty. When sls has not reached a local minimum, the value of one variable of X is changed in such a way that the sum of the weights of the falsified constraints decreases (line 15). In both latter cases, the value of the counter is decreased at each loop (line 16). Finally, when nbIt reaches a strictly negative value, a set M of constraints included in the final MUC to be exhibited is returned (line 17).

Let us stress that Algorithm 2 without the highlighted lines (6 to 13) is a mere standard stochastic local search procedure.
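The lstc loop can be sketched as follows in our own toy encoding. This is an illustrative reduction of Algorithm 2: the seed, the neighbourhood enumeration, and the parameter values (`nb_it`, `bonus`) are our own choices, and a brute-force cost function stands in for the incremental data structures of a real sls engine.

```python
import random

def lstc(domains, constraints, M, start=None, nb_it=100, bonus=50):
    """Weighted local search collecting transition constraints into M.
    Each local minimum that falsifies exactly one constraint reveals a
    transition constraint; rediscoveries are penalized, novelties rewarded."""
    rng = random.Random(0)              # fixed seed: deterministic sketch
    weights = [1] * len(constraints)
    rediscovered = {}
    assign = dict(start) if start else {v: rng.choice(domains[v]) for v in domains}

    def violated(a):
        return [i for i, (s, p) in enumerate(constraints)
                if not p(*(a[v] for v in s))]

    def cost(a):
        return sum(weights[i] for i in violated(a))

    while nb_it >= 0:
        nb_it -= 1
        # best single-value change strictly decreasing the weighted violation sum
        best, best_cost = None, cost(assign)
        for v in domains:
            for val in domains[v]:
                if val != assign[v]:
                    cand = dict(assign, **{v: val})
                    c = cost(cand)
                    if c < best_cost:
                        best, best_cost = cand, c
        if best is not None:
            assign = best                         # greedy move
        else:                                     # local minimum reached
            bad = violated(assign)
            if len(bad) == 1:                     # transition assignment
                c = bad[0]
                if c not in M:
                    M.add(c)                      # new transition constraint
                    nb_it += bonus
                else:
                    rediscovered[c] = rediscovered.get(c, 0) + 1
                    nb_it -= rediscovered[c]      # growing penalty
                    weights[c] += 1
            v = rng.choice(list(domains))         # escape move
            assign[v] = rng.choice(domains[v])
    return M

doms = {"x": [0, 1], "y": [0, 1], "z": [0, 1]}
neq = lambda a, b: a != b
cons = [(("x", "y"), neq), (("y", "z"), neq), (("x", "z"), neq)]
```

On the unsatisfiable triangle, every cost-1 assignment is a local minimum and a transition assignment, so the search keeps harvesting transition constraints while the bonus extends its budget.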

Noticeably, this approach differs from [10, 13], where a different form of sls was used to extract MUSes. First, [10, 13] was dedicated to the Boolean case, using specific features of the clausal Boolean framework: extending it to the general constraint networks setting while still obtaining acceptable running times for many real-life instances remains an open challenge. Second, it was used as a fast pre-processing step to locate an upper-approximation of a MUS. The role of lstc is different: this procedure is called during the fine-tuning process of the approximation delivered by the pre-processing. Finally, [10] and [13] were based on the so-called critical clause concept to explore the search space, which is not generalized and adopted here in the general framework of constraint networks.

5 Dichotomy Core Extraction with Local Search

Input: P = (X, C): an unsatisfiable cn
Output: one MUC of P
1   C ← wcore(P)                  /* preprocessing step */
2   M ← ∅                         /* set of constraints belonging to the MUC */
3   A' ← ∅                        /* last transition assignment found */
4   Δ ← choose half of the constraints of C (1)   /* set of constraints analyzed
    for possible removal, selected according to a dichotomy strategy */
5   while C ≠ M do
6       A ← solve((X, C \ Δ))     /* usual mac algorithm is used */
7       if A ≠ ∅ and |Δ| > 1 then /* a solution is found and more than one
                                     constraint has been removed */
8           Δ ← choose half of the constraints of Δ (1)   /* range of analyzed
            constraints is reduced */
9       else  if A = ∅ then C ← C \ Δ   /* no solution found: Δ can be
            removed from C while the resulting network remains unsat */
10            else M ← M ∪ Δ            /* solution found and |Δ| = 1:
                                           a transition constraint has been found */
11                 A' ← A               /* the last transition assignment is saved */
12            M ← lstc((X, C), M, A')   /* M is extended by lstc */
13            Δ ← choose half of the constraints of C \ M (1)
14  return C

(1) The scores collected during the wcore step are used to rank-order constraints.

Algorithm 3: Dichotomy Core Extraction with Local Search (dc(wcore)+lstc)


Algorithm 3 summarizes an algorithm that computes one MUC of an unsatisfiable constraint network P, based on the dichotomy strategy and relying on the local search procedure to extract additional transition constraints. As a preprocessing step, wcore delivers in C an unsatisfiable core of P that is not guaranteed to be minimal (line 1). M, the set of constraints that have already been recognized as belonging to the MUC, is then initialized to the empty set (line 2). The lastly discovered transition assignment A' is initialized to the empty set (line 3). According to a dichotomy strategy, Δ is initialized with half the constraints of C, themselves selected according to the scores collected during the wcore preprocessing step (line 4). A dichotomy-based loop is then run. While there remain constraints in C that have not yet been either removed from the candidate MUC or inserted in this MUC, the sub-network (X, C \ Δ) is solved and the solution is stored in A (line 6). By convention, when A is a model of (X, C \ Δ), it is not empty. In this case, and when at the same time the number of constraints belonging to Δ is different from 1, no conclusion can be made with this set of constraints and Δ is then refined according to the dichotomy strategy (line 8). When A is empty, (X, C \ Δ) is unsatisfiable and the constraints of Δ can be removed from C while keeping the unsatisfiability of this latter set (line 9). Finally, when A is not empty and Δ contains only one constraint c, c is a transition constraint and will appear in the final result (line 10) while the transition assignment is recorded in A' (line 11).

Calls to lstc are performed at each iteration step when A = ∅ or |Δ| = 1, with the following parameters: the current constraint network (X, C), the set M of constraints already identified as belonging to the MUC in construction, and the complete interpretation A' (line 12). These calls are thus intended to find additional transition constraints with respect to (X, C). Note that after the first main iterations in the loop of Algorithm 3 have allowed a first transition assignment to be found, all calls to lstc are made with the lastly discovered transition assignment as a parameter. Although there can thus exist several calls to lstc with the same transition assignment, it is important to note that C evolves, forming another constraint network at each call. It is thus transmitted to the local search procedure together with an expectedly “good” interpretation to start with. When C \ M becomes an empty set, this means that all constraints of C have been proved to belong to the MUC, i.e., C = M. Thus, at the end of the loop, C, identified as forming one MUC of P, is returned (line 14).

Importantly, Algorithm 3 is complete in the sense that it delivers one MUC for any unsatisfiable cn in finite time (but it is exponential-time in worst-case scenarios, like all complete algorithms for finding one MUC).

Let us stress that this algorithm differs from the dichotomy destructive strategy dc(wcore) in [17] according to the highlighted lines (namely lines 3, 11 and 12), only.
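Stripped of the wcore preprocessing and the lstc calls, the dichotomic loop can be sketched as follows (a brute-force satisfiability test stands in for the MAC solver, and the encoding is our own illustration, not the paper's implementation):

```python
from itertools import product

def satisfiable(domains, constraints):
    """Naive satisfiability test standing in for a MAC solver."""
    vs = list(domains)
    return any(all(p(*(dict(zip(vs, vals))[v] for v in s)) for s, p in constraints)
               for vals in product(*(domains[v] for v in vs)))

def dichotomy_muc(domains, core):
    """Dichotomy destructive strategy: test the removal of half of the
    not-yet-confirmed constraints; halve the tested set when removal restores
    satisfiability, drop it wholesale when the remainder stays unsatisfiable,
    and confirm a transition constraint when the tested set is a singleton."""
    cand, muc = list(core), []
    k = max(len(cand) // 2, 1)
    while cand:
        delta, rest = cand[:k], muc + cand[k:]
        if not satisfiable(domains, rest):
            cand = cand[k:]                  # delta removed for good
            k = max(len(cand) // 2, 1)
        elif k > 1:
            k = max(k // 2, 1)               # narrow the analyzed range
        else:
            muc.append(cand.pop(0))          # transition constraint found
            k = max(len(cand) // 2, 1)
    return muc

doms = {"x": [0, 1], "y": [0, 1], "z": [0, 1]}
neq = lambda a, b: a != b
core = [(("x",), lambda a: a in (0, 1)),     # redundant constraint
        (("x", "y"), neq), (("y", "z"), neq), (("x", "z"), neq)]
```

Keeping the confirmed set `muc` inside every satisfiability test preserves the invariant that the retained network stays unsatisfiable, so the set returned once the candidates are exhausted is a MUC.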

6 Experimental Results

In order to assess and compare the actual efficiency of the local-search approach with other methods, and in particular the model-rotation one, we have considered all the benchmarks from the last csp solver competitions [6, 7], which include binary vs. non-binary, random vs. real-life, satisfiable vs. unsatisfiable csp instances; the benchmarks are available online. Among these instances, only the 772 benchmarks that were proved unsatisfiable in less than 300 seconds using our own C++ csp solver rclCsp (whose executable is available online) were considered. rclCsp implements mac embedding AC3 and makes use of the dom/wdeg variable ordering heuristic.

Three approaches to the MUC extraction problem have been implemented with rclCsp as kernel and experimentally compared: (1) dc(wcore) [17], namely a dichotomy destructive strategy with wcore as a preprocessing step, without any form of model rotation or local search to find additional transition constraints, (2) dc(wcore)+rec-mr and (3) dc(wcore)+lstc. For the latter approach, the counter nbIt was initialized to a preset value and the bonus was set to that value, too. Furthermore, the escape criterion from [25] was used for the local search and the advanced data structures proposed in [22] have been implemented. All three versions have been run on a quad-core Intel Xeon X5550 with 32GB of memory under Linux CentOS 5. The time-out was set to 900 seconds.

Figure 3: Number of instances for which a MUC was extracted.

In Fig. 3, a cactus plot compares the three approaches in terms of the number of instances for which a MUC was extracted within a given CPU time. Several observations can be made. First, similarly to the sat setting for model rotation [5], recursive model rotation improves performance in the sense that dc(wcore)+rec-mr found a MUC for 632 instances whereas dc(wcore) solved only 609 instances. Then, local search in its turn improves on recursive model rotation according to the same criterion: dc(wcore)+lstc solved 663 instances. In addition to solving 54 and 31 additional instances, respectively, dc(wcore)+lstc appears to be more efficient in terms of CPU time and less demanding than the other competing approaches in terms of the number of calls to a csp solver for the more challenging instances.

In Fig. 4, pairwise comparisons between the approaches are provided. Each comparison between two methods is done in terms of CPU time (sub-figures (a), (c) and (e)) and of the number of calls to the mac solver (sub-figures (b), (d) and (f)), successively. For each scatter plot, the x-axis (resp. y-axis) corresponds to the result obtained by the method labelled on that axis. Each dot (x, y) thus gives the results for a given benchmark instance. Thus, dots above (resp. below) the diagonal represent instances for which the method labelled on the x-axis is better (resp. worse) than the method labelled on the y-axis. Points on the vertical (resp. horizontal) dashed line mean that the method labelled on the x-axis (resp. y-axis) did not solve the corresponding instance before the time-out.

(a) dc(wcore) vs. dc(wcore)+rec-mr
(b) dc(wcore) vs. dc(wcore)+rec-mr
(c) dc(wcore) vs. dc(wcore)+lstc
(d) dc(wcore) vs. dc(wcore)+lstc
(e) dc(wcore)+rec-mr vs. dc(wcore)+lstc
(f) dc(wcore)+rec-mr vs. dc(wcore)+lstc
Figure 4: Pairwise comparisons of dc(wcore), dc(wcore)+rec-mr and dc(wcore)+lstc.

Figure 5: Percentage of the constraints in the MUC discovered by rec-mr and lstc, respectively.

First, Fig. 4(a) shows that dc(wcore)+rec-mr solves instances faster than dc(wcore). Then, Fig. 4(c) and 4(e) show that, in most cases, dc(wcore)+lstc finds one MUC faster than both dc(wcore) and dc(wcore)+rec-mr manage to do, provided that the instance is difficult in the sense that extracting a MUC requires more than 100 seconds. On easier instances, the additional computational cost of local search (and model rotation) often has a negative impact on the global computing time, but the approach nevertheless remains very competitive. Fig. 4(b), (d) and (f) show the extent to which recursive model rotation and local search reduce the number of calls to a csp solver, and thus of calls to an NP-complete oracle, in order to find one MUC. Clearly, dc(wcore)+lstc often outperforms the other approaches in that respect. The observation of these three figures suggests that local search allows more transition constraints to be discovered by considering assignments in the neighborhood of the transition assignments. This intuition is confirmed by the experimental results reported in Fig. 5. In this latter figure, we give the percentage of the total size of the MUC that has been found by rec-mr and lstc, respectively. It shows that lstc detects more transition constraints than rec-mr. Moreover, it shows that for almost all instances, more than half of the constraints in the MUC are found thanks to local search when dc(wcore)+lstc is under consideration. This ability explains much of the performance gains obtained on difficult instances. Actually, local search detects the totality of the MUC for many instances.

Finally, Tab. 1 reports the detailed results of each approach on a typical panel of instances from the benchmarks. The first four columns provide the name of the instance, its numbers of constraints and variables, and the number of remaining constraints after the preprocessing step, successively. Then, for each method, the CPU time, the size of the extracted MUC (the MUCs discovered by the various methods can differ) and the number of csp solver calls needed to find it are listed. In addition, for dc(wcore)+rec-mr (resp. dc(wcore)+lstc), the number (“by rot.”) of constraints of the MUC detected by model rotation (resp. by local search, “by LS”) is provided. TO means time-out and the best computing time for each instance is shaded in grey. For example, MUCs of a same size (94 constraints) were found for the cc-10-10-2 instance using each of the three methods. lstc was the best performing one, extracting a MUC in 28.28 seconds (vs. 49.38 and 55.36 seconds for the other methods). Note that all the constraints in that MUC were discovered through the lstc procedure.

Instance                     Prep.   dc(wcore)              dc(wcore)+rec-mr               dc(wcore)+lstc
Name             #C    #X    size    time   size  solver    time   size  by    solver      time   size  by   solver
                                     (sec)        calls     (sec)        rot.  calls       (sec)        LS   calls
radar-8-24-3-2 64 144 41 213.84 8 31 136.15 8 6 15 110.90 8 7 10
radar-9-28-4-2 81 168 47 62.39 2 12 43.86 2 1 10 50.28 2 2 8
qKnights-50-add 1235 55 1233 358.21 5 148 355.81 5 4 126 151.64 5 5 16
qKnights-50-mul 1485 55 1484 335.55 5 102 352.27 5 4 96 172.16 5 5 16
qKnights-25-mul 435 30 435 37.58 5 268 30.26 5 4 212 9.19 5 5 13
qKnights-25-add 310 30 309 22.88 5 127 22.38 5 4 110 9.49 5 5 14
qKnights-20-add 200 25 198 6.22 5 73 5.96 5 4 58 3.69 5 5 13
bdd-30 2713 21 1157 TO - - TO - - - 855.96 10 3 59
bdd-2 2713 21 1307 TO - - TO - - - 725.23 10 4 49
bdd-25 2713 21 1392 719.65 10 53 677.24 10 1 48 590.79 10 3 38
bdd-18 2713 21 994 TO - - TO - - - 824.97 10 6 46
bdd-6 2713 21 802 TO - - TO - - - 530.59 9 4 28
bdd-3 2713 21 1551 910.64 11 75 889.34 11 0 74 748.77 11 2 45
ssa-0432-003 738 435 594 TO - - TO - - - 801.17 307 301 265
graph2-f25.xml 2245 400 1364 19.48 43 841 17.18 43 10 702 20.48 43 41 197
qcp-10-67-10 822 100 511 TO - - TO - - - 15.69 93 93 13
qcp-10-67-11 822 100 413 11.29 49 776 7.86 49 43 446 1.99 49 49 18
ruler-34-9-a4 582 9 276 TO - - 802.14 36 26 171 364.23 36 36 12
ruler-17-7-a4 196 7 97 2.05 29 196 1.78 29 16 119 8.49 29 29 23
ruler-25-8-a4 350 8 152 24.19 28 182 21.69 28 21 76 20.08 28 28 9
cc-10-10-2 2025 100 381 55.36 94 579 49.38 94 45 291 28.28 94 94 20
cc-15-15-2 11025 225 902 275.42 92 572 229.33 92 47 272 173.37 92 92 30
cc-12-12-2 4356 144 333 137.13 93 571 101.32 93 47 271 54.49 93 93 22
cc-20-20-2 36100 400 1232 TO - - 755.35 92 43 297 563.57 92 92 27
ehi-85-297-14 4111 297 265 1.43 38 428 1.54 37 11 420 2.19 37 33 113
ehi-85-297-64 4113 297 182 1.39 38 427 1.06 37 13 319 1.86 35 28 105
ehi-85-297-98 4124 297 682 0.69 22 192 0.47 22 11 118 1.09 21 20 61
s-os-taillard-4-4 160 32 159 396.21 42 346 319.14 42 22 186 293.55 42 35 89
s-os-taillard-4-9 160 32 160 322.84 37 302 275.31 37 13 232 195.03 37 29 104
s-os-taillard-4-10 160 32 148 37.46 29 268 33.46 29 12 190 45.05 29 25 100
s-os-taillard-5-25 325 50 279 186.26 10 38 181.76 10 9 15 204.69 10 10 10
BlackHole-1 431 64 142 105.03 75 658 240.09 80 56 269 8.08 75 73 24
BlackHole-5 431 64 140 208.35 77 657 118.04 77 58 253 18.98 82 80 48
BlackHole-0 431 64 139 96.92 75 645 102.16 77 56 291 10.59 75 72 46
BlackHole-4 431 64 140 100.07 77 664 21.48 77 55 252 11.59 77 74 48
BlackHole-3 431 64 142 476.00 80 700 283.92 80 58 261 13.43 75 74 35

The full table can be downloaded at

Table 1: Some typical experimental results.
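As a rough illustration (not part of the original experiments), the per-instance comparison summarized above, i.e. selecting the fastest extractor while discarding time-outs, can be reproduced from such result rows with a short script. The row values below are copied from Tab. 1; the function and column names are ours.

```python
# Hypothetical helper: pick the fastest MUC extractor per instance from rows
# shaped like Tab. 1. `None` encodes a time-out ("TO" in the table).
METHODS = ("dc(wcore)", "dc(wcore)+rec-mr", "dc(wcore)+lstc")

rows = [
    # name,             dc(wcore), +rec-mr, +lstc   (CPU seconds)
    ("cc-10-10-2",      55.36,     49.38,   28.28),
    ("qKnights-50-add", 358.21,    355.81,  151.64),
    ("bdd-30",          None,      None,    855.96),
]

def best_method(row):
    """Return (name, method, time) with the smallest CPU time, skipping time-outs."""
    name, *times = row
    finished = [(t, m) for t, m in zip(times, METHODS) if t is not None]
    t, m = min(finished)  # tuples compare on the time component first
    return name, m, t

for row in rows:
    print(best_method(row))
```

On the three sample rows, dc(wcore)+lstc is selected each time, matching the grey-shaded cells of the original table.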

7 Perspectives and conclusion

Clearly, the local search scheme proposed in this paper improves the extraction of one MUC by means of destructive strategies and opens many perspectives. Although dichotomy-based strategies, as explored in this paper, are known to be the most efficient ones, it could be interesting to graft this local search scheme onto constructive or QuickXplain-like methods. Also, note that we have not tried to fine-tune the various parameters of this local search scheme. In this respect, it would be interesting to devise forms of dynamic settings for these parameters that better take the recorded information about the previous search steps into account, as explored in [14]. In the future, we plan to explore more advanced concepts related to transition constraints, with the goal of better guiding the local search towards promising parts of the search space. In particular, so-called critical clauses [12] in the Boolean framework could be generalized in various ways to the full constraint-network setting. Exploring the ways in which lstc could benefit from this is a promising path for further research.
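The dichotomic identification of a single transition constraint, the building block that the destructive strategies above iterate and that local search supplements with extra transition constraints, can be sketched as follows. This is a generic sketch, not the authors' implementation: `is_sat` stands in for the CSP solver calls counted in Tab. 1, and it relies only on the monotonicity of unsatisfiability (any superset of an unsatisfiable prefix is unsatisfiable).

```python
# Sketch of the dichotomic search for one transition constraint over an
# unsatisfiable list of constraints. The constraint that closes the shortest
# unsatisfiable prefix is a transition constraint: dropping it from that
# prefix restores satisfiability, so it belongs to a MUC of the network.
def find_transition(constraints, is_sat):
    lo, hi = 0, len(constraints)
    # Invariant: constraints[:lo] is satisfiable, constraints[:hi] is not.
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if is_sat(constraints[:mid]):
            lo = mid
        else:
            hi = mid
    return constraints[hi - 1]  # the transition constraint
```

Each call uses O(log n) solver invocations instead of the linear scan of naive destructive methods; the lstc idea is then to run a cheap local search at each such step in the hope of exhibiting further transition constraints without extra complete-solver calls.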


Acknowledgements. This work has been partly supported by a grant from the Région Nord/Pas-de-Calais and by an EC FEDER grant.


References

  • [1] Gilles Audemard, Jean-Marie Lagniez, Bertrand Mazure, and Lakhdar Saïs. Boosting local search thanks to CDCL. In 17th International Conference on Logic for Programming, Artificial Intelligence and Reasoning (LPAR’2010), pages 474–488, 2010.
  • [2] René R. Bakker, F. Dikker, Frank Tempelman, and Petronella Maria Wognum. Diagnosing and solving over-determined constraint satisfaction problems. In Proceedings of the 13th International Joint Conference on Artificial Intelligence (IJCAI’93), volume 1, pages 276–281. Morgan Kaufmann, 1993.
  • [3] Anton Belov, Inês Lynce, and João Marques Silva. Towards efficient MUS extraction. AI Communications, 25:97–116, 2012.
  • [4] Anton Belov and João Marques Silva. MUSer2: An efficient MUS extractor, system description. Journal on Satisfiability, Boolean Modeling and Computation JSAT, 8:123–128, 2012.
  • [5] Anton Belov and João P. Marques Silva. Accelerating MUS extraction with recursive model rotation. In Proceedings of the International Conference on Formal Methods in Computer-Aided Design (FMCAD’2011), pages 37–40, 2011.
  • [6] Third international CSP solver competition, 2008.
  • [7] Fourth international constraint solver competition, 2009.
  • [8] J.L. de Siqueira N. and Jean-François Puget. Explanation-based generalization of failures. In Proceedings of the Eighth European Conference on Artificial Intelligence (ECAI’88), pages 339–344, 1988.
  • [9] Thomas Eiter and Georg Gottlob. On the complexity of propositional knowledge base revision, updates and counterfactual. Artificial Intelligence, 57:227–270, 1992.
  • [10] Éric Grégoire, Bertrand Mazure, and Cédric Piette. Extracting MUSes. In Proceedings of the 17th European Conference on Artificial Intelligence (ECAI’06), pages 387–391, 2006.
  • [11] Éric Grégoire, Jean-Marie Lagniez, and Bertrand Mazure. A CSP solver focusing on FAC variables. In 17th International Conference on Principles and Practice of Constraint Programming (CP’2011), pages 493–507. Lecture Notes in Computer Science 6876, Springer, 2011.
  • [12] Éric Grégoire, Bertrand Mazure, and Cédric Piette. Extracting MUSes. In 17th European Conference on Artificial Intelligence (ECAI’06), pages 387–391, 2006.
  • [13] Éric Grégoire, Bertrand Mazure, and Cédric Piette. Local-search extraction of MUSes. Constraints, 12(3):325–344, 2007.
  • [14] Éric Grégoire, Bertrand Mazure, and Cédric Piette. On finding minimally unsatisfiable cores of CSPs. International Journal on Artificial Intelligence Tools (IJAIT), 17(4):745 – 763, 2008.
  • [15] Éric Grégoire, Bertrand Mazure, Cédric Piette, and Lakhdar Saïs. A new heuristic-based albeit complete method to extract MUCs from unsatisfiable CSPs. In Proceedings of the IEEE International Conference on Information Reuse and Integration (IEEE-IRI’2006), pages 325–329, 2006.
  • [16] Fred Hémery, Christophe Lecoutre, Lakhdar Saïs, and Frédéric Boussemart. Boosting systematic search by weighting constraints. In Proceedings of the 16th European Conference on Artificial Intelligence (ECAI’2004), pages 482–486, 2004.
  • [17] Fred Hémery, Christophe Lecoutre, Lakhdar Saïs, and Frédéric Boussemart. Extracting MUCs from constraint networks. In 17th European Conference on Artificial Intelligence (ECAI’2006), pages 113–117, 2006.
  • [18] Ulrich Junker. QuickXplain: Conflict detection for arbitrary constraint propagation algorithms. In IJCAI’01 Workshop on Modelling and Solving Problems with Constraints (CONS-1), 2001.
  • [19] Ulrich Junker. QuickXplain: Preferred explanations and relaxations for over-constrained problems. In Proceedings of the 19th National Conference on Artificial Intelligence (AAAI’04), pages 167–172, 2004.
  • [20] Narendra Jussien and Vincent Barichard. The PaLM system: explanation-based constraint programming. In Proceedings of TRICS: Techniques foR Implementing Constraint programming Systems, a post-conference workshop of CP’00, pages 118–133, 2000.
  • [21] François Laburthe and The OCRE Project Team. Choco: implementing a CP kernel. In Proceedings of TRICS: Techniques foR Implementing Constraint programming Systems, a post-conference workshop of CP’00, 2000.
  • [22] Jean-Marie Lagniez, Éric Grégoire, and Bertrand Mazure. A data structure boosting the performance of local search for CSP solving. In International Conference on Metaheuristics and Nature Inspired Computing (META’12), 2012. (Paper also available from the authors’ webpages).
  • [23] Inês Lynce and João P. Marques Silva. On computing minimum unsatisfiable cores. In Proceedings of the 7th International Conference on Theory and Applications of Satisfiability Testing (SAT’04), 2004.
  • [24] João Marques-Silva, Mikoláš Janota, and Anton Belov. Minimal sets over monotone predicates in Boolean formulae. In Proceedings of the 25th International Conference on Computer-Aided Verification (CAV’2013), to appear, 2013.
  • [25] David McAllester, Bart Selman, and Henry A. Kautz. Evidence for invariants in local search. In Fourteenth National Conference on Artificial Intelligence (AAAI’97), pages 321–326, 1997.
  • [26] Paul Morris. The breakout method for escaping from local minima. In Proceedings of the Eleventh National Conference on Artificial Intelligence (AAAI’1993), pages 40–45. AAAI Press, 1993.
  • [27] Christos H. Papadimitriou and David Wolfe. The complexity of facets resolved. Journal of Computer and System Sciences, 37(1):2–13, 1988.
  • [28] Vadim Ryvchin and Ofer Strichman. Faster extraction of high-level minimal unsatisfiable cores. In Proceedings of the 14th International Conference on Theory and Applications of Satisfiability Testing (SAT’11), pages 174–187, 2011.
  • [29] Bart Selman, Henry A. Kautz, and Bram Cohen. Noise strategies for improving local search. In Twelfth National Conference on Artificial Intelligence (AAAI’1994), pages 337–343, 1994.
  • [30] Wieringa Siert. Understanding, improving and parallelizing MUS finding using model rotation. In 18th International Conference on Principles and Practice of Constraint Programming (CP’2012), pages 672–687. Lecture Notes in Computer Science 7514, Springer, 2012.
  • [31] João P. Marques Silva and Inês Lynce and. On improving MUS extraction algorithms. In Theory and Applications of Satisfiability Testing (SAT’11), Lecture Notes in Computer Science, pages 159–173. Springer, 2011.
  • [31] João P. Marques Silva and Inês Lynce. On improving MUS extraction algorithms. In Theory and Applications of Satisfiability Testing (SAT’11), Lecture Notes in Computer Science, pages 159–173. Springer, 2011.
  • [33] Hans Van Maaren and Siert Wieringa. Finding guaranteed MUSes fast. In Proceedings of the 11th International Conference on Theory and Applications of Satisfiability Testing (SAT’08), pages 291–304, 2008.