No-PASt-BO: Normalized Portfolio Allocation Strategy for Bayesian Optimization

08/01/2019
by Thiago de P. Vasconcelos, et al.

Bayesian Optimization (BO) is a framework for black-box optimization that is especially suitable for expensive cost functions. Among the main components of a BO algorithm, the acquisition function is of fundamental importance, since it guides the optimization by translating the uncertainty of the regression model into a utility measure for each candidate point. For this reason, the selection and design of acquisition functions is one of the most popular research topics in BO. Since no single acquisition function has been shown to perform best across all tasks, a well-established approach is to alternate between different acquisition functions along the iterations of a BO run. In this setting, the GP-Hedge algorithm is a widely used option given its simplicity and good performance. Despite its success in various applications, GP-Hedge has the undesirable characteristic of accounting for all past performance measures of each acquisition function when selecting the next one to be used. Consequently, good or bad values obtained in early iterations may affect the choice of acquisition function for the remainder of the run. This can induce a dominant behavior of one acquisition function and harm the final performance of the method. To overcome this limitation, in this work we propose a variant of GP-Hedge, named No-PASt-BO, that reduces the influence of evaluations from the distant past. Moreover, our method includes a built-in normalization that prevents the functions in the portfolio from having similar probabilities, thus improving exploration. The obtained results on both synthetic and real-world optimization tasks indicate that No-PASt-BO presents competitive performance and always outperforms GP-Hedge.
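
For illustration, below is a minimal sketch of a Hedge-style portfolio loop with a discounted-gain update in the spirit of No-PASt-BO. The `memory` factor, the toy reward definition, and the min-max normalization step are assumptions made for this example, not the paper's exact formulation.

```python
import numpy as np

def select_acquisition(gains, eta=1.0, rng=None):
    """Hedge-style selection: sample an acquisition function index with
    probability proportional to exp(eta * gain)."""
    rng = rng or np.random.default_rng()
    logits = eta * (gains - np.max(gains))        # numerically stabilized softmax
    probs = np.exp(logits) / np.sum(np.exp(logits))
    return rng.choice(len(gains), p=probs), probs

def update_gains(gains, rewards, memory=0.9):
    """Illustrative update: discount accumulated gains so distant-past
    rewards lose influence (hypothetical memory factor), then rescale so
    portfolio members do not drift toward near-uniform probabilities
    (hypothetical normalization)."""
    gains = memory * gains + rewards
    span = np.ptp(gains)
    if span > 0:
        gains = (gains - gains.min()) / span
    return gains

# Toy usage with a portfolio of three acquisition functions.
rng = np.random.default_rng(0)
gains = np.zeros(3)
for t in range(10):
    idx, probs = select_acquisition(gains, rng=rng)
    rewards = rng.normal(size=3)                  # stand-in for observed utilities
    gains = update_gains(gains, rewards)
```

In an actual BO loop, the stand-in rewards would instead come from the surrogate model's assessment of each acquisition function's proposed point after the chosen point is evaluated.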
