BOHB: Robust and Efficient Hyperparameter Optimization at Scale

by Stefan Falkner et al.

Modern deep learning methods are very sensitive to many hyperparameters, and, due to the long training times of state-of-the-art models, vanilla Bayesian hyperparameter optimization is typically computationally infeasible. On the other hand, bandit-based configuration evaluation approaches based on random search lack guidance and do not converge to the best configurations as quickly. Here, we propose to combine the benefits of both Bayesian optimization and bandit-based methods, in order to achieve the best of both worlds: strong anytime performance and fast convergence to optimal configurations. We propose a new practical state-of-the-art hyperparameter optimization method, which consistently outperforms both Bayesian optimization and Hyperband on a wide range of problem types, including high-dimensional toy functions, support vector machines, feed-forward neural networks, Bayesian neural networks, deep reinforcement learning, and convolutional neural networks. Our method is robust and versatile, while at the same time being conceptually simple and easy to implement.
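The abstract describes combining Hyperband's bandit-based budget allocation with model-based configuration proposals. The hypothetical sketch below illustrates that idea only in miniature: a successive-halving inner loop as in Hyperband, where a simple perturbation of the best observed configurations stands in for BOHB's TPE-style kernel density model. All names (`successive_halving`, `bohb_sketch`) and the toy 1-D search space are assumptions for illustration, not the authors' implementation.

```python
import random

def successive_halving(objective, configs, min_budget, eta=3, max_budget=27):
    """Evaluate configs on growing budgets, keeping the top 1/eta each round."""
    budget = min_budget
    results = []
    while configs:
        scored = sorted((objective(c, budget), c) for c in configs)
        results.extend((loss, c, budget) for loss, c in scored)
        if budget >= max_budget:
            break
        configs = [c for _, c in scored[: max(1, len(scored) // eta)]]
        budget *= eta
    return results

def bohb_sketch(objective, n_iters=8, max_budget=27, eta=3, seed=0):
    """Hyperband-style loop whose sampler becomes model-guided over time."""
    rng = random.Random(seed)
    history = []  # (loss, config) pairs observed at the highest budget
    for _ in range(n_iters):
        if len(history) >= 4 and rng.random() < 0.8:
            # Model-based proposal: perturb top observed configs (a crude
            # stand-in for BOHB's kernel density estimator over good configs).
            best = sorted(history)[: max(1, len(history) // 4)]
            configs = [min(1.0, max(0.0, c + rng.gauss(0.0, 0.1)))
                       for _, c in rng.choices(best, k=9)]
        else:
            # Random proposals, as in plain Hyperband.
            configs = [rng.random() for _ in range(9)]
        for loss, c, b in successive_halving(objective, configs, 1, eta, max_budget):
            if b == max_budget:
                history.append((loss, c))
    return min(history)  # (best loss, best config) at the full budget
```

The key design point this sketch mirrors is that early iterations behave like Hyperband (random, strong anytime performance), while later iterations reuse observed results to bias sampling toward promising regions, which is where the faster convergence comes from.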




Fast Bayesian Optimization of Machine Learning Hyperparameters on Large Datasets

Bayesian optimization has become a successful tool for hyperparameter op...

Fast Hyperparameter Optimization of Deep Neural Networks via Ensembling Multiple Surrogates

The performance of deep neural networks crucially depends on good hyperp...

Combination of Hyperband and Bayesian Optimization for Hyperparameter Optimization in Deep Learning

Deep learning has achieved impressive results on many problems. However,...

DC and SA: Robust and Efficient Hyperparameter Optimization of Multi-subnetwork Deep Learning Models

We present two novel hyperparameter optimization strategies for optimiza...

Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization

Performance of machine learning algorithms depends critically on identif...

DEHB: Evolutionary Hyperband for Scalable, Robust and Efficient Hyperparameter Optimization

Modern machine learning algorithms crucially rely on several design deci...

Tabular Benchmarks for Joint Architecture and Hyperparameter Optimization

Due to the high computational demands executing a rigorous comparison be...