Practical Multi-fidelity Bayesian Optimization for Hyperparameter Tuning

03/12/2019
by Jian Wu, et al.

Bayesian optimization is popular for optimizing time-consuming black-box objectives. Nonetheless, for hyperparameter tuning in deep neural networks, the time required to evaluate the validation error for even a few hyperparameter settings remains a bottleneck. Multi-fidelity optimization promises relief using cheaper proxies to such objectives --- for example, validation error for a network trained using a subset of the training points or fewer iterations than required for convergence. We propose a highly flexible and practical approach to multi-fidelity Bayesian optimization, focused on efficiently optimizing hyperparameters for iteratively trained supervised learning models. We introduce a new acquisition function, the trace-aware knowledge-gradient, which efficiently leverages both multiple continuous fidelity controls and trace observations --- values of the objective at a sequence of fidelities, available when varying fidelity using training iterations. We provide a provably convergent method for optimizing our acquisition function and show it outperforms state-of-the-art alternatives for hyperparameter tuning of deep neural networks and large-scale kernel learning.
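The abstract above describes fitting a surrogate over the joint (hyperparameter, fidelity) space and exploiting trace observations — the validation error revealed at every intermediate fidelity when training to a given number of iterations. The sketch below illustrates that idea in plain NumPy. It is *not* the paper's trace-aware knowledge-gradient (taKG): it substitutes a simple cost-aware loop with a GP surrogate, a lower-confidence-bound acquisition at full fidelity, and a heuristic fidelity choice. The toy objective, kernel length-scale, and thresholds are all invented for illustration.

```python
# Simplified multi-fidelity BO sketch with trace observations.
# NOT the paper's taKG acquisition; a hedged illustration only.
import numpy as np

def objective(x, s):
    """Toy 'validation error': minimized near x = 0.6; low fidelity s
    (fewer training iterations) biases the value upward."""
    return (x - 0.6) ** 2 + 0.3 * (1.0 - s)

def rbf(A, B, ls=0.3):
    """RBF kernel over the joint (hyperparameter, fidelity) space."""
    d2 = ((A[:, None, :] - B[None, :, :]) / ls) ** 2
    return np.exp(-0.5 * d2.sum(axis=-1))

def gp_posterior(X, y, Xq, noise=1e-6):
    """Standard GP regression posterior mean/std at query points Xq."""
    K = rbf(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Kq = rbf(Xq, X)
    mu = Kq @ alpha
    v = np.linalg.solve(L, Kq.T)
    var = np.clip(1.0 - (v ** 2).sum(axis=0), 1e-12, None)  # k(x,x)=1
    return mu, np.sqrt(var)

fidelities = np.array([0.25, 0.5, 1.0])
cand = np.linspace(0.0, 1.0, 41)
X, y = [], []

def evaluate(x, s):
    """Trace observation: evaluating x at fidelity s also reveals the
    objective at every grid fidelity <= s (checkpointed training)."""
    for f in fidelities[fidelities <= s]:
        X.append([x, f])
        y.append(objective(x, f))

evaluate(0.1, 1.0)  # seed with one full-fidelity trace

for _ in range(10):
    Xa, ya = np.array(X), np.array(y)
    Xq_full = np.column_stack([cand, np.ones_like(cand)])
    mu, sd = gp_posterior(Xa, ya, Xq_full)
    x_next = cand[np.argmin(mu - 2.0 * sd)]  # LCB at full fidelity
    # Fidelity choice heuristic: the cheapest fidelity that is still
    # informative (posterior std above a threshold), else full fidelity.
    s_next = 1.0
    for s in fidelities:
        _, sd_s = gp_posterior(Xa, ya, np.array([[x_next, s]]))
        if sd_s[0] > 0.05:
            s_next = s
            break
    evaluate(x_next, s_next)

Xa, ya = np.array(X), np.array(y)
full = Xa[:, 1] == 1.0
x_best = Xa[full, 0][np.argmin(ya[full])]
print("recommended x:", round(float(x_best), 3))
```

Evaluating cheap fidelities first mimics the training-iteration fidelity control the abstract mentions: a partial training run rules out poor hyperparameters at a fraction of the cost, and the trace fills in the surrogate at several fidelities from a single run.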

