Batch Multi-Fidelity Bayesian Optimization with Deep Auto-Regressive Networks

06/18/2021
by Shibo Li, et al.

Bayesian optimization (BO) is a powerful approach for optimizing black-box, expensive-to-evaluate functions. To enable a flexible trade-off between cost and accuracy, many applications allow the function to be evaluated at different fidelities. To reduce the optimization cost while maximizing the benefit-cost ratio, we propose Batch Multi-fidelity Bayesian Optimization with Deep Auto-Regressive Networks (BMBO-DARN). We use a set of Bayesian neural networks to construct a fully auto-regressive model that is expressive enough to capture strong yet complex relationships across all the fidelities, improving both surrogate learning and optimization performance. Furthermore, to enhance the quality and diversity of queries, we develop a simple yet efficient batch querying method that avoids any combinatorial search over the fidelities. We propose a batch acquisition function based on the Max-value Entropy Search (MES) principle, which penalizes highly correlated queries and encourages diversity. We use posterior samples and moment matching to enable efficient computation of the acquisition function, and we conduct alternating optimization over every fidelity-input pair, which guarantees an improvement at each step. We demonstrate the advantage of our approach on four real-world hyperparameter optimization applications.
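To make the auto-regressive construction concrete, here is a minimal sketch (not the authors' code) of a multi-fidelity surrogate in PyTorch: each fidelity has its own small network that takes the input x together with the previous fidelity's prediction, so information propagates along the whole fidelity chain. MC dropout is used here only as a crude stand-in for the Bayesian neural networks in the paper, and all layer sizes, names, and the sampling routine are illustrative assumptions.

    # Minimal sketch (assumptions labeled): an auto-regressive multi-fidelity
    # surrogate. Fidelity m conditions on the output of fidelity m-1.
    import torch
    import torch.nn as nn


    class AutoRegressiveSurrogate(nn.Module):
        def __init__(self, input_dim, num_fidelities, hidden=64, p_drop=0.1):
            super().__init__()
            self.nets = nn.ModuleList()
            for m in range(num_fidelities):
                # Fidelity 0 sees only x; higher fidelities also see the
                # previous fidelity's output (hence the +1 input feature).
                in_dim = input_dim + (1 if m > 0 else 0)
                self.nets.append(nn.Sequential(
                    nn.Linear(in_dim, hidden), nn.ReLU(),
                    nn.Dropout(p_drop),  # MC dropout as a crude posterior proxy
                    nn.Linear(hidden, 1),
                ))

        def forward(self, x):
            """Return predictions at every fidelity, lowest to highest."""
            outputs, prev = [], None
            for net in self.nets:
                inp = x if prev is None else torch.cat([x, prev], dim=-1)
                prev = net(inp)
                outputs.append(prev)
            return outputs

        @torch.no_grad()
        def posterior_samples(self, x, num_samples=32):
            """Approximate posterior samples at the highest fidelity by
            keeping dropout active (MC dropout); shape (num_samples, batch, 1)."""
            self.train()  # keep dropout on while sampling
            samples = torch.stack(
                [self.forward(x)[-1] for _ in range(num_samples)])
            self.eval()
            return samples


    # Usage: draw posterior samples at candidate inputs; their spread could
    # feed a max-value entropy style acquisition function.
    model = AutoRegressiveSurrogate(input_dim=3, num_fidelities=3)
    candidates = torch.rand(8, 3)
    samples = model.posterior_samples(candidates)
    print(samples.mean(0).squeeze(-1), samples.std(0).squeeze(-1))

In the paper, such posterior samples (together with moment matching) are what make the batch MES-style acquisition cheap to evaluate; the sketch above only shows the surrogate side, not the acquisition optimization.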


Related research

07/06/2020 - Multi-Fidelity Bayesian Optimization via Deep Neural Networks
11/05/2018 - Practical Batch Bayesian Optimization for Less Expensive Functions
03/12/2019 - Practical Multi-fidelity Bayesian Optimization for Hyperparameter Tuning
08/16/2019 - BOAH: A Tool Suite for Multi-Fidelity Bayesian Optimization & Analysis of Hyperparameters
02/26/2021 - Batch Bayesian Optimization on Permutations using Acquisition Weighted Kernels
09/30/2022 - Efficient computation of the Knowledge Gradient for Bayesian Optimization
09/01/2021 - LinEasyBO: Scalable Bayesian Optimization Approach for Analog Circuit Synthesis via One-Dimensional Subspaces
