
Batch Multi-Fidelity Bayesian Optimization with Deep Auto-Regressive Networks

by Shibo Li et al.

Bayesian optimization (BO) is a powerful approach for optimizing black-box, expensive-to-evaluate functions. To enable a flexible trade-off between cost and accuracy, many applications allow the function to be evaluated at different fidelities. To reduce the optimization cost while maximizing the benefit-cost ratio, we propose Batch Multi-fidelity Bayesian Optimization with Deep Auto-Regressive Networks (BMBO-DARN). We use a set of Bayesian neural networks to construct a fully auto-regressive model, expressive enough to capture strong yet complex relationships across all the fidelities, thereby improving surrogate learning and optimization performance. Furthermore, to enhance the quality and diversity of queries, we develop a simple yet efficient batch querying method that avoids any combinatorial search over the fidelities. We propose a batch acquisition function based on the Max-value Entropy Search (MES) principle, which penalizes highly correlated queries and encourages diversity. We use posterior samples and moment matching to compute the acquisition function efficiently, and conduct alternating optimization over each fidelity-input pair, which guarantees an improvement at every step. We demonstrate the advantage of our approach on four real-world hyperparameter optimization applications.
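The core structural idea of the fully auto-regressive surrogate can be illustrated with a minimal sketch: each fidelity level is modeled by its own network, and every network above the lowest fidelity takes the original input concatenated with the previous fidelity's prediction, so relationships across fidelities can be arbitrarily nonlinear. The sketch below is a simplified, hypothetical illustration only (deterministic tiny MLPs with random weights in numpy, not the Bayesian neural networks or training procedure of BMBO-DARN).

```python
import numpy as np

rng = np.random.default_rng(0)

def make_mlp(in_dim, hidden=16, out_dim=1):
    """Build a tiny one-hidden-layer MLP with random weights (stand-in
    for a trained Bayesian network; purely illustrative)."""
    W1 = rng.normal(size=(in_dim, hidden)) / np.sqrt(in_dim)
    b1 = np.zeros(hidden)
    W2 = rng.normal(size=(hidden, out_dim)) / np.sqrt(hidden)
    b2 = np.zeros(out_dim)
    def forward(x):
        h = np.tanh(x @ W1 + b1)
        return h @ W2 + b2
    return forward

def autoregressive_surrogate(x, num_fidelities=3, input_dim=2):
    """Predict at every fidelity, auto-regressively.

    Fidelity 0 sees only the input x; each higher fidelity m sees x
    concatenated with fidelity (m-1)'s prediction, so each level can
    learn a flexible, nonlinear map from the level below it.
    """
    nets = [make_mlp(input_dim)] + [
        make_mlp(input_dim + 1) for _ in range(num_fidelities - 1)
    ]
    preds, prev = [], None
    for m, net in enumerate(nets):
        inp = x if m == 0 else np.concatenate([x, prev], axis=-1)
        prev = net(inp)
        preds.append(prev)
    return preds

x = rng.normal(size=(4, 2))        # a batch of 4 candidate inputs
preds = autoregressive_surrogate(x)
print([p.shape for p in preds])    # one (4, 1) prediction per fidelity
```

In the full method, these per-fidelity predictions would come with posterior uncertainty, which the batch MES acquisition uses to penalize correlated queries.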




Related papers:
- Multi-Fidelity Bayesian Optimization via Deep Neural Networks
- Practical Batch Bayesian Optimization for Less Expensive Functions
- Practical Multi-fidelity Bayesian Optimization for Hyperparameter Tuning
- BOAH: A Tool Suite for Multi-Fidelity Bayesian Optimization & Analysis of Hyperparameters
- Deep Multi-Fidelity Active Learning of High-dimensional Outputs
- Efficient computation of the Knowledge Gradient for Bayesian Optimization
- LinEasyBO: Scalable Bayesian Optimization Approach for Analog Circuit Synthesis via One-Dimensional Subspaces