Learning Robust Search Strategies Using a Bandit-Based Approach

05/10/2018
by Wei Xia, et al.

Effective solving of constraint problems often requires a good or problem-specific search heuristic. However, choosing or designing such a heuristic is non-trivial and often a manual process. In this paper, rather than manually choosing or designing search heuristics, we propose the use of bandit-based learning techniques to select them automatically. Our approach is online: the solver learns from and selects among a set of heuristics during search. The goal is an automatic search heuristic that gives robust performance. Preliminary experiments show that our adaptive technique is more robust than the original search heuristics and can also outperform them.
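To make the idea concrete, below is a minimal sketch of how a bandit-based selector might pick a branching heuristic at each restart and update its estimates from an observed reward. This is an illustration only: the UCB1 policy, the reward signal `observe_reward`, and the candidate heuristic names are assumptions for the example, not details taken from the paper's abstract.

```python
import math
import random

def observe_reward(heuristic):
    """Stand-in reward signal. A real solver would measure something like
    search progress per restart; the paper's actual reward is not stated
    in this abstract."""
    return random.random()

class UCB1HeuristicSelector:
    """Selects a search heuristic per restart using the UCB1 bandit rule."""

    def __init__(self, heuristics):
        self.heuristics = heuristics
        self.counts = [0] * len(heuristics)    # times each arm was played
        self.values = [0.0] * len(heuristics)  # running mean reward per arm

    def select(self):
        # Play each arm once before applying the UCB1 formula.
        for i, c in enumerate(self.counts):
            if c == 0:
                return i
        total = sum(self.counts)
        # UCB1 score: mean reward plus an exploration bonus that shrinks
        # as an arm is tried more often.
        ucb = [
            self.values[i] + math.sqrt(2 * math.log(total) / self.counts[i])
            for i in range(len(self.heuristics))
        ]
        return max(range(len(self.heuristics)), key=lambda i: ucb[i])

    def update(self, arm, reward):
        # Incremental update of the running mean reward for this arm.
        self.counts[arm] += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]

# Hypothetical usage: one arm pull per solver restart.
selector = UCB1HeuristicSelector(["dom/wdeg", "activity", "impact"])
for restart in range(100):
    arm = selector.select()
    reward = observe_reward(selector.heuristics[arm])
    selector.update(arm, reward)
```

The appeal of such an online scheme is that exploration is built in: no single heuristic has to be committed to up front, which is what gives the adaptive approach its robustness across problem instances.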


