On Bayesian Search for the Feasible Space Under Computationally Expensive Constraints

04/23/2020
by Alma Rahat, et al.

We are often interested in identifying the feasible subset of a decision space under multiple constraints. However, in cases where the constraints cannot be represented by analytical formulae, the cost of solving these problems can be prohibitive, since the only way to determine feasibility is to run computationally or financially expensive simulations. We propose a novel approach for this problem: we learn a surrogate classifier that can rapidly and accurately identify feasible solutions using only a very limited number of samples (11n, where n is the dimension of the decision space), obviating the need for full simulations. This is a data-efficient active-learning approach using Gaussian processes (GPs), a form of Bayesian regression model, and we refer to this method as Bayesian search. Starting from a small training set, we train a GP model for each constraint. The algorithm then identifies the next decision vector to expensively evaluate using an acquisition function. We subsequently augment the training data set with each newly evaluated solution, improving the accuracy of the estimated feasibility at each step. This iterative process continues until the limit on the number of expensive evaluations is reached. Initially, we adapted acquisition functions from the reliability engineering literature for this purpose. However, these acquisition functions do not appropriately account for the uncertainty in the predictions offered by the GP models. We therefore introduce a new acquisition function that does: it combines the probability that a solution lies at the boundary between the feasible and infeasible spaces (representing exploitation) with the entropy in the predictions (representing exploration). In our experiments, the best classifier achieves a median informedness of at least 97.95% across five of the G problems.
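To make the loop concrete, here is a minimal sketch of the active-learning procedure described above for a single constraint, using scikit-learn GPs. The toy constraint g, the random candidate pool, and the exact way the boundary probability and predictive entropy are combined are illustrative assumptions on our part, not the authors' implementation.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def g(x):
    """Toy expensive constraint: feasible where g(x) <= 0 (assumed)."""
    return np.sum(x**2, axis=-1) - 0.5

def acquisition(mu, sigma):
    """Combine boundary probability (exploitation) with predictive
    entropy (exploration); one plausible combination, not the paper's."""
    sigma = np.maximum(sigma, 1e-12)
    p_boundary = norm.pdf(0.0, loc=mu, scale=sigma)       # density at the g = 0 boundary
    p_feasible = np.clip(norm.cdf(0.0, loc=mu, scale=sigma), 1e-12, 1 - 1e-12)
    entropy = -(p_feasible * np.log(p_feasible)
                + (1 - p_feasible) * np.log(1 - p_feasible))  # Bernoulli entropy
    return p_boundary * entropy

n_dim = 2
budget = 11 * n_dim                      # 11n expensive evaluations, as in the abstract
X = rng.uniform(-1, 1, (5, n_dim))       # small initial design
y = g(X)

for _ in range(budget - len(X)):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
    cand = rng.uniform(-1, 1, (2000, n_dim))           # cheap candidate pool
    mu, sigma = gp.predict(cand, return_std=True)
    x_next = cand[np.argmax(acquisition(mu, sigma))]   # most informative candidate
    X = np.vstack([X, x_next])
    y = np.append(y, g(x_next))                        # one expensive evaluation

# Final surrogate classifier: feasible where the predicted mean satisfies the constraint.
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
feasible = gp.predict(cand) <= 0.0
```

For multiple constraints, the paper trains one GP per constraint; a natural extension of this sketch would aggregate the per-constraint acquisition values before selecting the next point to evaluate.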

Related research

11/02/2022  Fantasizing with Dual GPs in Bayesian Optimization and Active Learning
Gaussian processes (GPs) are the main surrogate functions used for seque...

08/25/2017  Active Expansion Sampling for Learning Feasible Domains in an Unbounded Input Space
Many engineering problems require identifying feasible domains under imp...

06/15/2019  Global optimization via inverse distance weighting
Global optimization problems whose objective function is expensive to ev...

12/13/2019  Active emulation of computer codes with Gaussian processes – Application to remote sensing
Many fields of science and engineering rely on running simulations with ...

08/27/2021  Approximate Bayesian Optimisation for Neural Networks
A body of work has been done to automate machine learning algorithm to h...

05/24/2021  Entropy-based adaptive design for contour finding and estimating reliability
In reliability analysis, methods used to estimate failure probability ar...
