Many Objective Bayesian Optimization

07/08/2021
by Lucia Asencio-Martín, et al.

Some real problems require the evaluation of expensive and noisy objective functions. Moreover, the analytical expression of these objective functions may be unknown. Such functions are known as black-boxes; examples are estimating the generalization error of a machine learning algorithm and computing its prediction time in terms of its hyper-parameters. Multi-objective Bayesian optimization (MOBO) is a set of methods that has been successfully applied to the simultaneous optimization of black-boxes. Concretely, BO methods rely on a probabilistic model of the objective functions, typically a Gaussian process. This model generates a predictive distribution of the objectives. However, MOBO methods face problems when the number of objectives in a multi-objective optimization problem is 3 or more, the so-called many-objective setting. In particular, the BO process becomes more costly as more objectives are considered, computing the quality of the solution via the hyper-volume is also more costly and, most importantly, every objective function has to be evaluated, wasting expensive computational, economic or other resources. However, as more objectives are involved in the optimization problem, it is highly probable that some of them are redundant and do not add information about the problem solution. A measure that represents how similar GP predictive distributions are is proposed. We also propose a many-objective Bayesian optimization algorithm that uses this metric to determine whether two objectives are redundant. The algorithm stops evaluating one of them when such a similarity is found, saving resources without hurting the performance of the multi-objective BO algorithm. We show empirical evidence of the effectiveness of the metric and the algorithm in a set of toy, synthetic, benchmark and real experiments.
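The abstract does not spell out the closed form of the similarity measure. As a rough illustration of the idea only, the sketch below fits an independent GP to each of two objectives and averages a symmetrized KL divergence between their Gaussian predictive marginals over a grid of candidate inputs, flagging the pair as redundant when the average falls below a threshold. The symmetric-KL choice, the grid, the threshold value and all names here are illustrative assumptions, not the authors' exact metric or algorithm.

```python
# Illustrative sketch only: a symmetrized KL divergence between GP predictive
# marginals is used as a stand-in for the paper's similarity measure.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern


def gaussian_sym_kl(mu_p, sigma_p, mu_q, sigma_q):
    """Symmetric KL divergence between univariate Gaussians (vectorized)."""
    def kl(m1, s1, m2, s2):
        return np.log(s2 / s1) + (s1 ** 2 + (m1 - m2) ** 2) / (2.0 * s2 ** 2) - 0.5
    return kl(mu_p, sigma_p, mu_q, sigma_q) + kl(mu_q, sigma_q, mu_p, sigma_p)


def predictive_similarity(gp_a, gp_b, X_grid):
    """Average symmetric KL between the two GPs' predictive marginals on X_grid.
    Lower values mean the two objective models make more similar predictions."""
    mu_a, sd_a = gp_a.predict(X_grid, return_std=True)
    mu_b, sd_b = gp_b.predict(X_grid, return_std=True)
    return float(np.mean(gaussian_sym_kl(mu_a, sd_a, mu_b, sd_b)))


# Toy usage: two noisy objectives, the second nearly a copy of the first.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(25, 1))
y1 = np.sin(6.0 * X[:, 0]) + 0.05 * rng.standard_normal(25)
y2 = np.sin(6.0 * X[:, 0]) + 0.05 * rng.standard_normal(25)  # redundant objective

gp1 = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-3).fit(X, y1)
gp2 = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-3).fit(X, y2)

X_grid = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
score = predictive_similarity(gp1, gp2, X_grid)

REDUNDANCY_THRESHOLD = 0.1  # hypothetical cutoff, chosen here for illustration
print(f"similarity score = {score:.4f}, redundant = {score < REDUNDANCY_THRESHOLD}")
```

In a many-objective BO loop, a check of this kind could run after each batch of evaluations, so that an objective detected as redundant stops being queried from that point on and its GP is reused for the surviving objective.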


Related research

09/05/2016
Predictive Entropy Search for Multi-objective Bayesian Optimization with Constraints
This work presents PESMOC, Predictive Entropy Search for Multi-objective...

01/20/2021
A Similarity Measure of Gaussian Process Predictive Distributions
Some scenarios require the computation of a predictive distribution of a...

02/18/2019
The Kalai-Smorodinski solution for many-objective Bayesian optimization
An ongoing aim of research in multiobjective Bayesian optimization is to...

03/19/2021
Multicriteria asset allocation in practice
In this paper we consider the strategic asset allocation of an insurance...

06/12/2017
Dealing with Integer-valued Variables in Bayesian Optimization with Gaussian Processes
Bayesian optimization (BO) methods are useful for optimizing functions t...

04/01/2020
Parallel Predictive Entropy Search for Multi-objective Bayesian Optimization with Constraints
Real-world problems often involve the optimization of several objectives...
