Bayesian Optimization over Discrete and Mixed Spaces via Probabilistic Reparameterization

10/18/2022
by Samuel (Sam) Daulton, et al.

Optimizing expensive-to-evaluate black-box functions of discrete (and potentially continuous) design parameters is a ubiquitous problem in scientific and engineering applications. Bayesian optimization (BO) is a popular, sample-efficient method that leverages a probabilistic surrogate model and an acquisition function (AF) to select promising designs to evaluate. However, maximizing the AF over mixed or high-cardinality discrete search spaces is challenging: standard gradient-based methods cannot be used directly, and evaluating the AF at every point in the search space would be computationally prohibitive. To address this issue, we propose using probabilistic reparameterization (PR). Instead of directly optimizing the AF over the search space containing discrete parameters, we maximize the expectation of the AF over a probability distribution defined by continuous parameters. We prove that under suitable reparameterizations, the BO policy that maximizes the probabilistic objective is the same as that which maximizes the AF, and therefore, PR enjoys the same regret bounds as the original BO policy using the underlying AF. Moreover, our approach provably converges to a stationary point of the probabilistic objective under gradient ascent using scalable, unbiased estimators of both the probabilistic objective and its gradient. Therefore, as the number of starting points and gradient steps increases, our approach will recover a maximizer of the AF (an often-neglected requisite for commonly used BO regret bounds). We validate our approach empirically and demonstrate state-of-the-art optimization performance on a wide range of real-world applications. PR is complementary to (and benefits) recent work and naturally generalizes to settings with multiple objectives and black-box constraints.
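To make the mechanics concrete, here is a minimal sketch of the PR idea for purely binary design parameters, under assumptions not taken from the paper: a toy acquisition function, a Bernoulli distribution parameterized by continuous logits, and a score-function (REINFORCE) Monte Carlo gradient estimator for ascent on the expected AF. The names af and pr_ascent are hypothetical and this is not the authors' implementation; it only illustrates optimizing an expectation over a continuous distribution in place of a discrete search.

    import numpy as np

    def af(x):
        # Toy stand-in for an acquisition function over binary vectors
        # (hypothetical; any AF evaluated at sampled designs would do).
        target = np.array([1, 0, 1, 1, 0])
        return -np.sum((x - target) ** 2, axis=-1).astype(float)

    def pr_ascent(d=5, n_steps=200, n_mc=64, lr=0.1, seed=0):
        # Maximize E_{x ~ Bernoulli(sigmoid(phi))}[af(x)] by gradient ascent.
        rng = np.random.default_rng(seed)
        phi = np.zeros(d)  # continuous logits replacing the discrete parameters
        for _ in range(n_steps):
            theta = 1.0 / (1.0 + np.exp(-phi))                 # probabilities in (0, 1)
            x = (rng.random((n_mc, d)) < theta).astype(float)  # Monte Carlo design samples
            vals = af(x)
            # Score-function (REINFORCE) estimate of grad_phi E[af(x)]:
            # for a Bernoulli with a sigmoid link, grad_phi log p(x) = x - theta.
            baseline = vals.mean()  # batch-mean baseline for variance reduction
            grad = ((vals - baseline)[:, None] * (x - theta)).mean(axis=0)
            phi += lr * grad
        theta = 1.0 / (1.0 + np.exp(-phi))
        return (theta > 0.5).astype(int)  # round to a discrete candidate design

    print(pr_ascent())  # recovers the toy AF's maximizer, [1 0 1 1 0]

Rounding the final probabilities to a single design is the simplest readout; because the expectation is taken over sampled discrete designs, the estimator remains unbiased even though the AF itself has no gradients with respect to the discrete inputs.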

Related research

10/21/2022 · Local Bayesian optimization via maximizing probability of descent
Local optimization presents a promising approach to expensive, high-dime...

07/02/2019 · Mixed-Variable Bayesian Optimization
The optimization of expensive to evaluate, black-box, mixed-variable fun...

07/02/2022 · Tree ensemble kernels for Bayesian optimization with known constraints over mixed-feature spaces
Tree ensembles can be well-suited for black-box optimization tasks such ...

06/08/2023 · Gradient-Informed Quality Diversity for the Illumination of Discrete Spaces
Quality Diversity (QD) algorithms have been proposed to search for a lar...

02/15/2022 · Robust Multi-Objective Bayesian Optimization Under Input Noise
Bayesian optimization (BO) is a sample-efficient approach for tuning des...

01/19/2020 · Distributionally Robust Bayesian Quadrature Optimization
Bayesian quadrature optimization (BQO) maximizes the expectation of an e...

06/08/2021 · Bayesian Optimization over Hybrid Spaces
We consider the problem of optimizing hybrid structures (mixture of disc...
