Hyperparameter Optimization: Foundations, Algorithms, Best Practices and Open Challenges

07/13/2021
by Bernd Bischl, et al.

Most machine learning algorithms are configured by one or several hyperparameters that must be carefully chosen and that often have a considerable impact on performance. To avoid a time-consuming and unreproducible manual trial-and-error process for finding well-performing hyperparameter configurations, various automatic hyperparameter optimization (HPO) methods can be employed, e.g., methods based on resampling error estimation for supervised machine learning. After introducing HPO from a general perspective, this paper reviews important HPO methods such as grid or random search, evolutionary algorithms, Bayesian optimization, Hyperband and racing. It gives practical recommendations regarding important choices to be made when conducting HPO, including the HPO algorithms themselves, performance evaluation, how to combine HPO with ML pipelines, runtime improvements, and parallelization.
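To make the kind of procedure described above concrete, the following is a minimal sketch of random-search HPO with cross-validated (resampling-based) error estimation, written in Python with scikit-learn. The dataset, search space, and evaluation budget are illustrative assumptions and are not taken from the paper.

```python
# A minimal sketch (not from the paper) of random-search HPO with
# resampling-based error estimation via 5-fold cross-validation.
# Search space, budget, and dataset are illustrative assumptions.
from scipy.stats import randint, uniform
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

X, y = load_breast_cancer(return_X_y=True)

# Hyperparameter search space (assumed ranges for illustration).
search_space = {
    "n_estimators": randint(50, 500),    # number of trees
    "max_depth": randint(2, 20),         # maximum tree depth
    "max_features": uniform(0.1, 0.9),   # fraction of features per split
}

search = RandomizedSearchCV(
    RandomForestClassifier(random_state=0),
    param_distributions=search_space,
    n_iter=50,           # evaluation budget: 50 sampled configurations
    cv=5,                # 5-fold cross-validation as the resampling scheme
    scoring="accuracy",
    random_state=0,
    n_jobs=-1,           # evaluate configurations in parallel
)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```

The same resampling-based evaluation loop underlies the more sophisticated optimizers the paper reviews; what changes between methods is mainly how new candidate configurations are proposed and how the evaluation budget is allocated.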


