Parallel Predictive Entropy Search for Multi-objective Bayesian Optimization with Constraints

Real-world problems often involve the optimization of several objectives under multiple constraints. Furthermore, we may not have an analytical expression for each objective or constraint; they may be expensive to evaluate; and their evaluations may be noisy. Such functions are referred to as black-boxes. Bayesian optimization (BO) can efficiently solve problems of this kind. To do so, BO iteratively fits a probabilistic model to the observations of each black-box. These models are then used to choose where to evaluate the black-boxes next, with the goal of solving the optimization problem in a small number of iterations. In particular, they guide the search towards the problem's solution and avoid evaluations in regions of little expected utility. A limitation, however, is that current BO methods for these problems select only one input location at a time at which to evaluate the black-boxes. If the expensive evaluations can be carried out in parallel (as when a cluster of computers is available), this results in a waste of resources. Here, we introduce PPESMOC, Parallel Predictive Entropy Search for Multi-objective Optimization with Constraints, a BO strategy for solving the problems described. At each iteration, PPESMOC selects a batch of input locations at which to evaluate the black-boxes, in parallel, so as to maximally reduce the entropy of the problem's solution. To our knowledge, this is the first batch method for constrained multi-objective BO. We present empirical evidence, in the form of synthetic, benchmark, and real-world experiments, that illustrates the effectiveness of PPESMOC.
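As a rough illustration of the batch BO loop described above, the sketch below fits one Gaussian-process surrogate per objective and per constraint and then selects a batch of inputs to evaluate in parallel. This is a minimal sketch under stated assumptions: the toy objectives f1, f2 and constraint c1 are invented for illustration, and the batch is chosen with a simple uncertainty-and-feasibility placeholder score, not with PPESMOC's entropy-reduction acquisition.

```python
# Hedged sketch of a generic batch (parallel) BO loop for a constrained
# multi-objective problem. The acquisition below is a placeholder standing in
# for PPESMOC's entropy-based criterion, which is not reproduced here.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)

# Toy black-box objectives and one constraint (feasible when c1(x) >= 0);
# in practice these would be expensive, noisy evaluations.
def f1(x): return np.sum((x - 0.25) ** 2, axis=-1)
def f2(x): return np.sum((x - 0.75) ** 2, axis=-1)
def c1(x): return 0.5 - np.abs(x[..., 0] - 0.5)

dim, batch_size, n_iter = 2, 4, 5
X = rng.random((6, dim))                       # initial design
evals = {"f1": f1(X), "f2": f2(X), "c1": c1(X)}

for _ in range(n_iter):
    # Fit one GP surrogate per objective and per constraint.
    models = {k: GaussianProcessRegressor(alpha=1e-6, normalize_y=True).fit(X, v)
              for k, v in evals.items()}

    # Score a pool of candidates. A real implementation would jointly optimize
    # whole batches under the entropy-based acquisition; here we simply rank
    # candidates by predictive uncertainty weighted by estimated feasibility.
    cand = rng.random((256, dim))
    mu_c, sd_c = models["c1"].predict(cand, return_std=True)
    prob_feasible = 1.0 / (1.0 + np.exp(-mu_c / (sd_c + 1e-9)))
    score = prob_feasible * sum(models[k].predict(cand, return_std=True)[1]
                                for k in ("f1", "f2"))
    batch = cand[np.argsort(-score)[:batch_size]]

    # Evaluate the batch "in parallel" and append the observations.
    X = np.vstack([X, batch])
    for k, fn in (("f1", f1), ("f2", f2), ("c1", c1)):
        evals[k] = np.concatenate([evals[k], fn(batch)])
```

In the actual method, the batch is chosen jointly so as to maximally reduce the entropy of the problem's solution, rather than by ranking individual candidate points as in this placeholder.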
