Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization

03/21/2016
by Lisha Li, et al.

Performance of machine learning algorithms depends critically on identifying a good set of hyperparameters. While current methods offer efficiencies by adaptively choosing new configurations to train, an alternative strategy is to adaptively allocate resources across the selected configurations. We formulate hyperparameter optimization as a pure-exploration non-stochastic infinite-armed bandit problem, where a predefined resource such as iterations, data samples, or features is allocated to randomly sampled configurations. We introduce Hyperband for this framework and analyze its theoretical properties, providing several desirable guarantees. Furthermore, we compare Hyperband with state-of-the-art methods on a suite of hyperparameter optimization problems. We observe that Hyperband provides five times to thirty times speedup over state-of-the-art Bayesian optimization algorithms on a variety of deep-learning and kernel-based learning problems.
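The allocation strategy the abstract describes can be illustrated with a minimal sketch of Hyperband's outer loop over brackets and its SuccessiveHalving inner loop. The callbacks `get_config` (samples a random configuration) and `run_config(config, resource)` (trains with the given budget and returns a loss) are hypothetical names introduced here for illustration, not an interface from the paper; constants follow the paper's common example of `max_resource=81`, `eta=3`.

```python
import math
import random

def hyperband(get_config, run_config, max_resource=81, eta=3):
    """Hedged sketch of Hyperband: run several brackets that trade off
    the number of sampled configurations against the resource each gets."""
    s_max = int(math.log(max_resource, eta))
    B = (s_max + 1) * max_resource          # approximate budget per bracket
    best = (float("inf"), None)             # (lowest loss seen, its config)
    for s in reversed(range(s_max + 1)):
        # More aggressive brackets: many configs, each with little resource.
        n = int(math.ceil(B / max_resource * eta**s / (s + 1)))
        r = max_resource * eta**-s
        configs = [get_config() for _ in range(n)]
        # SuccessiveHalving: train, keep the top 1/eta, grow the budget.
        for i in range(s + 1):
            n_i = int(n * eta**-i)
            r_i = r * eta**i
            losses = [run_config(c, r_i) for c in configs]
            ranked = sorted(zip(losses, configs), key=lambda t: t[0])
            if ranked and ranked[0][0] < best[0]:
                best = ranked[0]
            configs = [c for _, c in ranked[: max(1, n_i // eta)]]
    return best
```

The key design point the abstract alludes to: because no single trade-off between breadth (many configurations) and depth (resource per configuration) is known in advance, Hyperband hedges by running every bracket from most to least aggressive.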

Related research

07/04/2018
BOHB: Robust and Efficient Hyperparameter Optimization at Scale
Modern deep learning methods are very sensitive to many hyperparameters,...

01/26/2019
A Practical Bandit Method with Advantages in Neural Network Tuning
Stochastic bandit algorithms can be used for challenging non-convex opti...

08/05/2021
HyperJump: Accelerating HyperBand via Risk Modelling
In the literature on hyper-parameter tuning, a number of recent solution...

10/08/2018
CHOPT: Automated Hyperparameter Optimization Framework for Cloud-Based Machine Learning Platforms
Many hyperparameter optimization (HyperOpt) methods assume restricted co...

02/18/2023
Online Continuous Hyperparameter Optimization for Contextual Bandits
In stochastic contextual bandit problems, an agent sequentially makes ac...

02/03/2023
A Lipschitz Bandits Approach for Continuous Hyperparameter Optimization
One of the most critical problems in machine learning is HyperParameter ...

06/25/2020
Globally-convergent Iteratively Reweighted Least Squares for Robust Regression Problems
We provide the first global model recovery results for the IRLS (iterati...
