Unsatisfiable Cores for Constraint Programming

05/08/2013 ∙ by Nicholas Downing, et al. ∙ The University of Melbourne

Constraint Programming (CP) solvers typically tackle optimization problems by repeatedly finding solutions to a problem while placing tighter and tighter bounds on the solution cost. This approach is somewhat naive, especially for soft-constraint optimization problems in which the soft constraints are mostly satisfied. Unsatisfiable-core approaches to solving soft-constraint problems in Boolean Satisfiability (e.g. MAXSAT) force all soft constraints to hold initially. When solving fails they return an unsatisfiable core: a set of soft constraints that cannot hold simultaneously. Using this information the problem is relaxed to allow certain soft constraint(s) to be violated, and solving continues. Since Lazy Clause Generation (LCG) solvers can also return unsatisfiable cores, we can adapt the MAXSAT unsatisfiable-core approach to CP. We implement the original MAXSAT unsatisfiable-core solving algorithms WPM1 and MSU3 in a state-of-the-art LCG solver and show that there exist problems which benefit from this hybrid approach.

1 Introduction

In this paper we consider how to make Constraint Programming (CP) solvers better at tackling soft-constraint problems. CP solvers typically tackle optimization problems using branch-and-bound, and although soft-constraint problems can easily be mapped into a CP optimization framework, a distinguishing feature is that we expect most soft constraints to hold, at least on typical problems. CP solvers rely heavily on propagation to cut down the search space, but soft constraints have little propagation ability (because even though the constraints are likely to hold, we do not know for sure), and search takes over.

Hence, existing CP solvers are nearly always terrible at soft-constraint problems, a deficiency made worse by the fact that typical search strategies are unaware of where the good solutions lie, so that in some cases thousands of solutions must be enumerated before the solver gets close to proving optimality. Indeed, problems which suit CP well are almost always handled better by Mixed Integer Programming (MIP) once we soften the constraints. Here we consider a different approach, which is to lift the unsatisfiable-core solving approaches from MAXSAT and integrate them into a CP solver.

By aggressively assuming that soft constraints hold, including the intensional soft constraints characteristic of CP problems, we either find a solution or obtain an unsatisfiable core: a set of soft constraints which cannot hold simultaneously. Given an unsatisfiable core we adjust our assumptions and proceed, until feasibility is reached. We show that this approach maps well to CP solvers (as long as they can derive an unsatisfiable core, which in practice means that the CP solver must use Lazy Clause Generation), and that there are problems which can benefit enormously from such an approach.

2 Lazy Clause Generation (LCG)

We give a brief description of propagation-based solving and LCG; for more details see [12]. We consider problems consisting of a set of constraints C over integer variables x_1, ..., x_n, each with a given finite initial domain D_init(x_i). A feasible solution is a valuation θ to the variables which satisfies all constraints c ∈ C and lies in the initial domain, i.e. θ(x_i) ∈ D_init(x_i) for each x_i.

A propagation solver keeps a domain restriction D(x_i) ⊆ D_init(x_i) for each variable and considers only solutions that lie within these restricted domains. Solving interleaves propagation, which repeatedly applies propagators to remove unsupported values, and search, which splits the domain of some variable and considers the resulting sub-problems. This continues until all variables are fixed (success) or failure is detected (backtrack and try another subproblem).
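
The interplay of propagation and search can be pictured with a tiny sketch. The following is an illustrative toy in Python, not the architecture of any particular solver: domains are sets, propagators are assumed to be functions that prune the domains in place and report whether they pruned anything, and branch is an assumed callback that picks a variable and splits its domain.

    def solve(domains, propagators, branch):
        # copy the domains so that backtracking is just returning from the recursion
        doms = {x: set(d) for x, d in domains.items()}
        changed = True
        while changed:                                  # propagation to fixpoint
            changed = False
            for prop in propagators:
                changed |= prop(doms)                   # True if it removed values
                if any(len(d) == 0 for d in doms.values()):
                    return None                         # failure: backtrack
        if all(len(d) == 1 for d in doms.values()):
            return {x: next(iter(d)) for x, d in doms.items()}  # all fixed: success
        x, left, right = branch(doms)                   # search: split some domain
        for part in (left, right):
            sub = dict(doms)
            sub[x] = part
            solution = solve(sub, propagators, branch)
            if solution is not None:
                return solution
        return None                                     # both halves failed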

Lazy clause generation is implemented by introducing Boolean variables for each potential value of a CP variable, named [[x = v]], and for each bound, [[x <= v]]. Negating them gives [[x != v]] and [[x >= v+1]]. Fixing such a literal modifies the domain D(x) to make the corresponding fact true, and vice versa. Hence the literals give an alternate Boolean representation of the domain, which supports reasoning. Lazy clause generation makes use of clauses to record nogoods, where a clause is a disjunction of (or essentially just a set of) literals.

In a lazy clause generation solver, the actions of propagators (and search) in changing domains are recorded in an implication graph over the literals. Whenever a propagator changes a domain it must explain how the change occurred in terms of literals: each literal l that is made true must be explained by a clause S -> l, where S is a conjunction of literals. When the propagator detects failure it must explain the failure as a nogood S -> false, with S a conjunction of literals which cannot hold simultaneously. This explanation is then used for conflict analysis [11] to generate a nogood that explains the failure.
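
To make the Boolean view concrete, the following sketch (an illustration only, not the solver's own encoding routines) generates the [[x = v]] and [[x <= v]] literals as DIMACS-style integers, together with the clauses linking them, for an integer variable x with domain lo..hi.

    from itertools import count

    def domain_encoding(lo, hi):
        fresh = count(1)
        le = {v: next(fresh) for v in range(lo, hi)}      # [[x <= v]]; [[x <= hi]] is trivially true
        eq = {v: next(fresh) for v in range(lo, hi + 1)}  # [[x = v]]
        clauses = []
        for v in range(lo, hi - 1):
            clauses.append([-le[v], le[v + 1]])           # [[x <= v]] implies [[x <= v+1]]
        for v in range(lo, hi + 1):
            if v < hi:
                clauses.append([-eq[v], le[v]])           # x = v implies x <= v
            if v > lo:
                clauses.append([-eq[v], -le[v - 1]])      # x = v implies not (x <= v-1)
            clause = [eq[v]]                              # x <= v and not (x <= v-1) implies x = v
            if v < hi:
                clause.append(-le[v])
            if v > lo:
                clause.append(le[v - 1])
            clauses.append(clause)
        return le, eq, clauses

    le, eq, cls = domain_encoding(1, 3)                   # e.g. x with domain {1, 2, 3}
    print(len(cls), "clauses over", len(le) + len(eq), "Boolean variables")

Negated literals are simply negative integers, so [[x != v]] and [[x >= v+1]] need no separate variables.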

3 MAXSAT solving algorithms

Before discussing how we integrate MAXSAT solving methods into CP, we illustrate the original MAXSAT algorithms by means of a simple clausal example.

Example 1

Consider Boolean variables , , and clausal constraints

Then is an unsatisfiable core, because violates . Similarly is an unsatisfiable core, because violates . As a Boolean Satisfiability (SAT) problem this is infeasible. As a Maximum Satisfiability (MAXSAT) problem, we can allow some soft constraint(s) to be violated. For example if we relax and we find a solution , , with 2 violated constraints. Or if we relax we find a better solution , , which has only one constraint violated. ∎

For MAXSAT we minimize sum_{i=1..n} v_i, where n is the number of clauses and v_i is 0 if clause c_i holds or 1 if c_i is violated. A generalization is weighted partial MAXSAT, where we minimize sum_{i=1..n} w_i v_i given a weight w_i attached to each soft clause c_i, with hard clauses encoded by w_i = ∞.

3.1 Branch-and-bound algorithm

For branch-and-bound we first convert the soft-constraint problem into a hard-constraint problem using violator variables v_i. We rewrite each soft clause c_i to v_i ∨ c_i, that is, the original clause will now only be enforced if its violator v_i is false.

Example 2

Adding violator variables to the soft clauses of Example 1 yields the hard-constraint optimization problem

minimize such that

Violator variables which are false simply disappear from their clause, whereas violator variables which are true automatically satisfy their clause, which hence plays no further role. ∎

inputs: clauses c_1, ..., c_n with weights w_1, ..., w_n over variables x_1, ..., x_m
outputs: valuation minimizing the sum of weights of violated constraints
add violator variables to the constraints: c_i := v_i ∨ c_i for each i where w_i ≠ ∞
while the SAT solver finds a valuation θ to the clause set do
    cost := sum_i w_i θ(v_i), where θ(v_i) is 1 for true, 0 for false/nonexistent
    add a decomposition of the constraint sum_i w_i v_i < cost to the clause set
Algorithm 1 Branch-and-bound for weighted partial MAXSAT

Branch-and-bound search is defined in Algorithm 1. We treat the problem as a hard-constraint satisfaction problem and simply find any solution, calculate its objective value, and add a new constraint to the problem enforcing that the next solution found must have a better objective value. When this fails, the most recent solution found (if any) is optimal.
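
As a concrete (if simplified) rendering of Algorithm 1, the sketch below handles the unweighted case (all soft weights 1) on top of the python-sat (pysat) package; clauses are lists of non-zero DIMACS integers, and the function name and bookkeeping are illustrative rather than the paper's implementation.

    from pysat.solvers import Glucose3
    from pysat.card import CardEnc, EncType

    def branch_and_bound(hard, soft, top):
        solver = Glucose3()
        for clause in hard:
            solver.add_clause(clause)
        violators = []
        for clause in soft:                                # rewrite c_i as c_i or v_i
            top += 1
            violators.append(top)
            solver.add_clause(list(clause) + [top])
        best = None
        while solver.solve():
            model = set(solver.get_model())
            cost = sum(1 for v in violators if v in model)
            best = (cost, solver.get_model())
            if cost == 0:
                break                                      # cannot be improved
            if cost == 1:                                  # demand zero violations
                for v in violators:
                    solver.add_clause([-v])
            else:                                          # demand fewer violations
                enc = CardEnc.atmost(lits=violators, bound=cost - 1,
                                     top_id=top, encoding=EncType.seqcounter)
                for clause in enc.clauses:
                    solver.add_clause(clause)
                top = max([top] + [abs(l) for c in enc.clauses for l in c])
        return best                 # None if even the hard clauses are infeasible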

The weakness of branch-and-bound for soft-constraint problems is that soft constraints do not propagate, so we need to set violator variables before the solver learns anything. In contrast, branch-and-bound is very good for infeasible problems, since it detects infeasibility in the first solve.

3.2 Fu and Malik (Msu1 or Wpm1) algorithm

Fu and Malik [6] proposed the Msu1 algorithm for MAXSAT solving, later generalized by Ansótegui et al. [1] to Wpm1 for the weighted case. These algorithms iterate through a series of infeasible SAT problems until feasibility is reached. When the SAT solver fails, it returns an unsatisfiable core as a set of clauses that cannot hold simultaneously. Soft clauses in the core are relaxed using a fresh set of violator variables which are constrained so that at most one of them is true (an atmost1 constraint), and solving continues. The first solution found is guaranteed to minimize the number of, or sum of weights of, violated clauses.

inputs: clauses c_1, ..., c_n with weights w_1, ..., w_n over variables x_1, ..., x_m
outputs: valuation minimizing the sum of weights of violated constraints
cost := 0
repeat
    if the SAT solver finds a valuation θ to the clause set then
        report θ as optimal with cost cost; break
    otherwise, the SAT solver returns an unsatisfiable core
    find the minimum increase in cost implied by the core: w_min := the smallest weight among its soft clauses
    if w_min = ∞ (the core contains only hard clauses) then
        report infeasibility; break
    cost := cost + w_min
    create a fresh set of violator variables, which we will call v_1, ..., v_k for now
    for each clause c_i in the core where w_i ≠ ∞ do
        if w_i > w_min then
            add a new copy of clause c_i to the clause set with weight w_i - w_min
        relax the original copy of the clause: c_i := c_i ∨ v_j (its fresh violator), with weight w_min
    add a decomposition of the atmost1 constraint v_1 + ... + v_k ≤ 1 to the clause set
    delete all learnt clauses (or at least those invalidated by the above changes)
Algorithm 2 Wpm1 for weighted partial MAXSAT (Msu1 is a special case)

Wpm1, of which Msu1 is the unweighted special case, is defined in Algorithm 2. Solving the MAXSAT problem as SAT with the soft clauses considered hard, we find either a solution or an unsatisfiable core. In the latter case, we create a new MAXSAT problem by encoding into it an allowance that we will not charge the first w_min units of the penalty for violating the clauses in the unsatisfiable core. If multiple clauses of the core are violated, or if violated clause(s) have weight greater than w_min, then the remaining violation is charged as usual. The amount of penalty waived accumulates in cost. Eventually the relaxed MAXSAT problem is solved with no remaining violations (i.e. all its soft clauses are satisfied), and then the accumulated cost is the optimal solution cost.
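
The loop is easiest to see in code. The sketch below covers the unweighted special case (Msu1) under the same assumptions as the earlier sketch (python-sat package, integer literals, illustrative names); it extracts cores through assumption literals, one fresh selector per soft clause. The weighted Wpm1 additionally duplicates clauses whose weight exceeds the minimum weight in the core, as in Algorithm 2.

    from pysat.solvers import Glucose3
    from pysat.card import CardEnc, EncType

    def msu1(hard, soft, top):
        cost = 0
        softs = [list(c) for c in soft]        # relaxation variables accumulate here
        while True:
            solver = Glucose3()
            for clause in hard:
                solver.add_clause(clause)
            assumptions, owner = [], {}
            for i, clause in enumerate(softs):
                top += 1                       # fresh selector: assuming -top forces c_i
                solver.add_clause(clause + [top])
                owner[top] = i
                assumptions.append(-top)
            if solver.solve(assumptions=assumptions):
                return cost, solver.get_model()
            core = solver.get_core() or []     # subset of the failed assumptions
            if not core:
                return None                    # the hard clauses alone are unsatisfiable
            cost += 1                          # one further violation must be paid for
            relax = []
            for lit in core:                   # relax each soft clause in the core
                top += 1
                softs[owner[-lit]].append(top)
                relax.append(top)
            amo = CardEnc.atmost(lits=relax, bound=1, top_id=top,
                                 encoding=EncType.pairwise)    # the atmost1 constraint
            hard = hard + amo.clauses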

Note the similarity of Wpm1 to destructive lower-bound search, where we set the objective bound to 0, solve, and if that fails increase the bound by one, re-solve and repeat; the first solution found is optimal. Wpm1 does better by restricting where the violation is allowed to occur at each further stage of the search.

Example 3

To the problem of Example 1 we now add an extra variable and the extra clauses and . The first unsatisfiable core is . Rewriting these clauses with violator variables , , gives

where which we decompose to the additional hard clauses

Then solving fails by deriving the empty clause as shown in Figure 1. The leaves of this tree show that the next core is . Relaxing the problem again (including and that had already been relaxed) gives

This has a solution which violates two of the original clauses, and is optimal with cost 2. ∎

Figure 1: Example proof trace

3.3 Marques-Silva & Planes (Msu3) algorithm

The difficulty with Wpm1 is that it is extremely aggressive, in the sense that the only soft-constraint violations allowed are those which are already known to exist. On many problems the aggressive approach pays off, but on other problems either too many unsatisfiable cores need to be enumerated before achieving feasibility, or else the problems get more and more difficult to prove infeasible (the increasing number of atmost1 constraints leads to an exponential number of relaxation-variable assignments, some of which may be symmetric).

inputs: clauses c_1, ..., c_n with weights w_1, ..., w_n over variables x_1, ..., x_m
outputs: valuation minimizing the sum of weights of violated constraints
for each i where w_i ≠ ∞ do
    add a violator variable to the constraint: c_i := v_i ∨ c_i
    add the temporary singleton clause ¬v_i
repeat
    if the SAT solver finds a valuation θ to the clause set then
        cost := sum_i w_i θ(v_i), where θ(v_i) is treated as in Algorithm 1
        add a decomposition of the constraint sum_i w_i v_i < cost to the clause set
    else
        the SAT solver returns an unsatisfiable core
        if the unsatisfiable core contains no temporary clauses then
            break
        delete the identified temporary clauses from the problem
        delete all learnt clauses (or at least those invalidated by the above change)
Algorithm 3 Msu3 for weighted partial MAXSAT

Marques-Silva & Planes [10] proposed Msu3 for solving problems which Wpm1 does not handle efficiently for the above reasons. Msu3 is defined in Algorithm 3. It is a hybrid unsatisfiable-core and branch-and-bound approach, which leverages some of the benefits of unsatisfiable-core solving without being so aggressive. All soft constraints are considered hard initially, but each time an unsatisfiable core is found, the soft constraints in the core revert to soft; the ordinary branch-and-bound process then continues to minimize their violations.

Msu3 resembles binary search: we probe an initially overconstrained problem; if that fails we relax it, otherwise we constrain it further to find a better solution.
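
Under the same assumptions as the earlier sketches (unweighted case, python-sat package, illustrative names), the hybrid loop looks as follows. Solver assumptions stand in for the temporary singleton clauses of Algorithm 3, so "deleting" a temporary clause is just dropping its assumption and no learnt clauses need to be discarded.

    from pysat.solvers import Glucose3
    from pysat.card import CardEnc, EncType

    def msu3(hard, soft, top):
        solver = Glucose3()
        for clause in hard:
            solver.add_clause(clause)
        violators = []
        for clause in soft:                     # rewrite c_i as c_i or v_i
            top += 1
            violators.append(top)
            solver.add_clause(list(clause) + [top])
        frozen = set(violators)                 # violators still assumed false
        best = None
        while True:
            if solver.solve(assumptions=[-v for v in frozen]):
                model = set(solver.get_model())
                cost = sum(1 for v in violators if v in model)
                best = (cost, solver.get_model())
                if cost == 0:
                    return best
                if cost == 1:                   # branch-and-bound step: demand
                    for v in violators:         # strictly fewer violations
                        solver.add_clause([-v])
                else:
                    enc = CardEnc.atmost(lits=violators, bound=cost - 1,
                                         top_id=top, encoding=EncType.seqcounter)
                    for clause in enc.clauses:
                        solver.add_clause(clause)
                    top = max([top] + [abs(l) for c in enc.clauses for l in c])
            else:
                core = solver.get_core() or []
                freed = {-lit for lit in core} & frozen
                if not freed:
                    return best                 # core has no temporaries: optimal
                frozen -= freed                 # let these soft clauses be violated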

Example 4

Rewriting Example 1 in the required format gives the same problem described in Example 2 plus the additional temporary clauses

Solving fails with unsatisfiable core . Removing , , from the problem yields the solution , , . Constraining propagates and returns the unsatisfiable core , which has no temporaries, hence solving terminates. ∎

4 Unsatisfiable cores for LCG

We can straightforwardly adapt the previously described soft-constraint optimization approaches to CP. A soft intensional constraint c is represented as a half-reified constraint [5] of the form b -> c, where b is the indicator variable for the constraint c. If b is true then the constraint holds, and if b is false then the constraint has no effect.

Note that a CP solver which has a propagator for a constraint c can straightforwardly be extended to provide a half-reified version of the constraint. Furthermore, the explanation algorithm for c in an LCG solver can also easily be extended to explain the half-reified version.
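
As a toy illustration of this point (the tuple interface and the literal strings are invented for the example, not CPX's propagator API), a half-reified bounds propagator for b -> (x <= y) only needs to add the indicator literal [[b]] to each explanation it would otherwise produce:

    def propagate_half_reified_leq(b_is_true, x_min, x_max, y_min, y_max):
        # returns (new x_max, new y_min, explanations, failed); each explanation
        # pairs the literal made true with the literals that imply it
        explanations = []
        if not b_is_true:
            return x_max, y_min, explanations, False   # b unfixed or false: no effect
        new_x_max, new_y_min = x_max, y_min
        if y_max < x_max:                               # tighten the upper bound of x
            new_x_max = y_max
            explanations.append(("[[x <= %d]]" % y_max, ["[[b]]", "[[y <= %d]]" % y_max]))
        if x_min > y_min:                               # tighten the lower bound of y
            new_y_min = x_min
            explanations.append(("[[y >= %d]]" % x_min, ["[[b]]", "[[x >= %d]]" % x_min]))
        failed = x_min > y_max      # nogood: [[b]] and [[x >= x_min]] and [[y <= y_max]]
        return new_x_max, new_y_min, explanations, failed

A fuller version would also infer that b is false, explained by [[x >= x_min]] and [[y <= y_max]] alone, once x <= y has become impossible.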

By adding indicator variables we effectively map the soft-constraint problem to a MAXSAT problem. If the soft intensional constraint c_i has a weight w_i then we add the singleton clause [b_i] as a soft indicator clause with weight w_i. Now we can apply Wpm1 or Msu3 effectively unchanged on the weighted indicator clauses. Soft constraints are enforced when their indicator clauses hold. Unsatisfiable cores enumerate conflicting indicator clauses and hence conflicting soft constraints.

For Wpm1 (Algorithm 2) these indicator clauses play an important role, as they will be progressively relaxed and won't necessarily be singletons by the end of solving. For branch-and-bound (Algorithm 1) and Msu3 (Algorithm 3), the indicator clauses disappear (leaving only temporary clauses in the Msu3 case), because instead of augmenting the indicator clauses with violators and creating a useless implication, we can simply equate each violator variable with the negation of its indicator variable.

To make use of soft global constraints that return a number of violations, we can simply use literals encoding the integer violation count. For example the constraint soft_alldifferent([x_1, ..., x_k], V) enforces that V is a violation count, e.g. the number of pairs i < j with x_i = x_j. The usual LCG encoding of V creates bounds literals [[V >= 1]], [[V >= 2]], etc. We can make an indicator clause from each of these literals, with weights equal to the marginal cost of each soft_alldifferent violation, and hence map to a weighted soft clause problem.
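
A small sketch of this mapping, assuming a uniform marginal cost and a hypothetical helper v_geq(k) that yields the literal [[V >= k]]: each soft indicator clause is the negated bounds literal, so violating the k-th clause (i.e. V >= k) is charged once, and the total penalty is marginal_cost * V.

    def violation_indicator_clauses(v_geq, max_violations, marginal_cost):
        # weighted soft clauses ([ not [[V >= k]] ], marginal_cost) for k = 1..max_violations
        return [([-v_geq(k)], marginal_cost) for k in range(1, max_violations + 1)]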

5 Experimental evaluation

To illustrate the potential usefulness of unsatisfiable-core based optimization for CP, we consider a soft-constraint variant of the Resource Constrained Project Scheduling Problem (RCPSP). Rather than minimize makespan, we constrain the makespan to be some percentage of the optimal makespan and soften all the precedences. These problems are similar to RCPSP problems where the aim is to minimize the number of tardy jobs (those that finish after their specified due date).

To create instances we take each RCPSP/max [2] instance from the sets ubo20, sm_j30, ubo50 in PSPLib [7], which are systematically generated by ProGen/max [13], together with a proven lower bound on its minimum makespan, usually the optimal makespan. We constrain all tasks of each instance to complete before a deadline derived from this bound, for each of three deadline settings. We maximize the number or, in a second experiment, the sum of randomly chosen weights, of the precedences that hold.

We aim to show that Wpm1 and Msu3 can be advantageous over branch-and-bound, hence we run an LCG solver with all three methods. A secondary aim is to show that LCG-based unsatisfiable-core approaches can be superior to other solving technologies, so we provide the best known decompositions to pseudo-Boolean (PB), MAXSAT and MIP form, and evaluate them on (i) the SAT-based PB solver MiniSAT+ 19/11/2012 [4], (ii) the unsatisfiability-based MAXSAT solver MSUnCore 6/6/2011 [10], and (iii) the MIP solvers CPLEX 12.4 and SCIP 3.0.1.

For our own solver we used CPX, a state-of-the-art LCG solver, but modified to implement the Wpm1 and Msu3 algorithms. CPX and SCIP both use learning and a built-in cumulative propagator. In these tests CPX uses activity-based search with phase saving and geometric restarts; the other solvers use their default searches which are similar, at least for the SAT-based solvers.

We use a cluster of Dell PowerEdge 1950 machines, each with 2 × 2.0 GHz Intel Quad Core Xeon E5405 processors (2 × 6MB cache) and 16 GB RAM, with a 600s timeout and a 1GB memory limit per core. Data files are available from http://www.csse.unimelb.edu.au/~pjs/unsat_core. We disregard infeasible instances, where branch-and-bound will always be superior; almost all soft-constraint problems of interest are feasible. We also disregard instances for which all solvers timed out.

cardinality version (each cell: geometric mean time in seconds / number of timeouts)

        #ins  cpx b&b    cpx msu1   cpx msu3   sat b&b     sat msu1    sat msu3    cplex       scip
ubo20     53  0.386/0    0.142/4    0.091/0    1.046/0     1.083/1     1.312/0     16.350/7    120.897/16
j30      126  2.333/8    0.197/10   0.188/5    2.317/0     1.838/2     2.400/0     32.121/30   187.327/68
ubo50     47  16.543/6   0.775/4    0.894/2    515.777/36  349.598/20  475.797/33  174.902/21  531.704/44

        #ins  cpx b&b    cpx msu1   cpx msu3   sat b&b     sat msu1    sat msu3    cplex       scip
ubo20     61  0.236/0    0.038/1    0.038/0    0.933/0     0.773/1     1.072/0     9.059/7     77.723/14
j30      151  1.838/17   0.097/16   0.125/10   2.737/1     1.895/6     2.866/2     22.611/33   155.496/71
ubo50     56  9.668/7    0.269/1    0.398/0    446.866/39  314.035/24  458.802/39  178.648/25  523.275/51

        #ins  cpx b&b    cpx msu1   cpx msu3   sat b&b     sat msu1    sat msu3    cplex       scip
ubo20     64  0.124/0    0.016/1    0.022/0    0.865/0     0.598/0     0.937/0     6.727/4     61.616/10
j30      176  1.729/10   0.057/9    0.087/5    2.281/1     1.541/2     2.386/3     20.832/36   146.206/75
ubo50     63  6.540/3    0.097/2    0.227/2    385.716/41  275.759/29  380.389/46  161.992/23  504.252/56

weighted version (each cell: geometric mean time in seconds / number of timeouts)

        #ins  cpx b&b    cpx wpm1   cpx msu3   sat b&b     sat wpm1    sat msu3    cplex       scip
ubo20     53  0.320/0    0.118/9    0.091/0    1.392/0     2.670/8     1.917/0     17.747/4    96.520/17
j30      128  1.420/6    0.154/21   0.181/2    3.537/1     3.236/12    3.298/1     29.061/34   176.513/71
ubo50     48  10.783/3   0.548/8    0.871/1    549.042/40  329.905/19  501.536/37  168.355/24  516.557/44

        #ins  cpx b&b    cpx wpm1   cpx msu3   sat b&b     sat wpm1    sat msu3    cplex       scip
ubo20     61  0.196/0    0.058/6    0.048/0    1.040/0     1.080/3     1.140/0     7.813/5     59.778/14
j30      151  1.250/8    0.081/16   0.137/8    3.349/1     2.556/15    2.987/2     21.072/31   147.957/69
ubo50     56  9.252/2    0.167/2    0.449/0    476.299/39  300.131/20  439.167/38  129.856/22  455.111/46

        #ins  cpx b&b    cpx wpm1   cpx msu3   sat b&b     sat wpm1    sat msu3    cplex       scip
ubo20     64  0.137/0    0.015/2    0.027/0    1.015/0     0.678/2     0.940/0     4.967/4     36.493/7
j30      177  1.206/5    0.060/14   0.109/4    2.811/1     1.912/11    2.376/1     17.888/40   128.818/74
ubo50     62  6.739/2    0.077/2    0.314/0    411.847/44  232.487/20  373.572/40  102.812/22  438.892/54

Table 1: Comparative results for soft precedence RCPSP problems (three blocks per version, from the tightest to the most generous makespan constraint)

The results shown in Table 1 compare our solver CPX using branch-and-bound, Wpm1 (or Msu1 as a special case) and Msu3 (first three columns) with the SAT-based solvers MiniSAT+ using branch-and-bound and MSUnCore using Wpm1 or Msu3 (next three columns), and the MIP solvers (last two columns). The numbers shown are the geometric mean of solving time in seconds (counting 600s for instances that timed out), followed by the number of timeouts.

The results show that both Wpm1 and Msu3 can be highly advantageous over branch-and-bound when used with a learning CP solver. As the makespan constraint becomes more generous, the percentage of soft constraints that can hold increases, and the advantages of Wpm1 and Msu3 over branch-and-bound (in assuming that soft constraints hold) become more pronounced.

As the problems become larger, the huge numbers of variables created by the decompositions begin to overwhelm the SAT/MIP solvers, demonstrating the importance of using an LCG solving approach in addition to the unsatisfiability-based algorithms Wpm1 and Msu3 already available in the MSUnCore solver.

6 Related work and conclusion

Specialized solvers [3, 8] have been highly successful for soft-constraint CSPs in extensional form. These approaches are similar to Wpm1 as they both effectively shift part of the cost function as inconsistencies are detected. But many problems (such as the scheduling problem we investigate) are not feasible to encode using extensional constraints only. We are unaware of any other approaches to soft intensionally defined constraint problems beyond branch-and-bound, apart from PBO/WBO [4, 9] which support intensionally-defined linear constraints only.

In this paper we demonstrate how to use unsatisfiable-core methods developed for MAXSAT to solve CP optimization problems containing soft constraints, by making use of the facility of LCG solvers to generate unsatisfiable cores. The results clearly show that CP solvers should incorporate unsatisfiable core optimization algorithms, since they can be dramatically superior to branch-and-bound on appropriate problems.

References

  • [1] C. Ansótegui, M. Bonet, and J. Levy (2009) Solving (Weighted) Partial MaxSAT through Satisfiability Testing. In Theory and Applications of Satisfiability Testing - SAT 2009, Lecture Notes in Computer Science, Vol. 5584, pp. 427-440.
  • [2] M. Bartusch, R.H. Möhring, and F.J. Radermacher (1988) Scheduling project networks with resource constraints and time windows. Annals of Operations Research 16(1), pp. 199-240.
  • [3] S. de Givry, F. Heras, M. Zytnicki, and J. Larrosa (2005) Existential arc consistency: Getting closer to full arc consistency in weighted CSPs. In International Joint Conference on Artificial Intelligence - IJCAI 2005, pp. 193-198.
  • [4] N. Eén and N. Sörensson (2006) Translating Pseudo-Boolean Constraints into SAT. JSAT 2(1-4), pp. 1-26.
  • [5] T. Feydy, Z. Somogyi, and P. Stuckey (2011) Half Reification and Flattening. In Principles and Practice of Constraint Programming - CP 2011, Lecture Notes in Computer Science, Vol. 6876, pp. 286-301.
  • [6] Z. Fu and S. Malik (2006) On Solving the Partial MAX-SAT Problem. In Theory and Applications of Satisfiability Testing - SAT 2006, Lecture Notes in Computer Science, Vol. 4121, pp. 252-265.
  • [7] R. Kolisch and A. Sprecher (1997) PSPLIB - A project scheduling problem library: OR Software - ORSEP Operations Research Software Exchange Program. European Journal of Operational Research 96(1), pp. 205-216.
  • [8] J. Larrosa and T. Schiex (2003) In the quest of the best form of local consistency for Weighted CSP. In International Joint Conference on Artificial Intelligence - IJCAI 2003, pp. 239-244.
  • [9] V. Manquinho, J. Marques-Silva, and J. Planes (2009) Algorithms for Weighted Boolean Optimization. In Theory and Applications of Satisfiability Testing - SAT 2009, Lecture Notes in Computer Science, Vol. 5584, pp. 495-508.
  • [10] J. Marques-Silva and J. Planes (2011) Algorithms for Maximum Satisfiability Using Unsatisfiable Cores. In Advanced Techniques in Logic Synthesis, Optimizations and Applications, pp. 171-182.
  • [11] M. W. Moskewicz, C. F. Madigan, Y. Zhao, L. Zhang, and S. Malik (2001) Chaff: engineering an efficient SAT solver. In Proceedings of the 38th Annual Design Automation Conference (DAC '01), pp. 530-535.
  • [12] O. Ohrimenko, P. J. Stuckey, and M. Codish (2009) Propagation via lazy clause generation. Constraints 14, pp. 357-391.
  • [13] C. Schwindt (1995) ProGen/max: A New Problem Generator for Different Resource-Constrained Project Scheduling Problems with Minimal and Maximal Time Lags. Technical Report WIOR 449, Universität Karlsruhe.