Automated Benchmark-Driven Design and Explanation of Hyperparameter Optimizers

11/29/2021
by Julia Moosbauer et al.

Automated hyperparameter optimization (HPO) has gained great popularity and is an important ingredient in most automated machine learning frameworks. Designing HPO algorithms, however, remains an unsystematic, manual process: limitations of prior work are identified, and the proposed improvements, even though guided by expert knowledge, are still somewhat arbitrary. This rarely yields a holistic understanding of which algorithmic components drive performance and carries the risk of overlooking good algorithmic design choices. We present a principled approach to automated benchmark-driven algorithm design, applied to multifidelity HPO (MF-HPO). First, we formalize a rich space of MF-HPO candidates that includes, but is not limited to, common HPO algorithms, and present a configurable framework covering this space. To find the best candidate automatically and systematically, we follow a programming-by-optimization approach and search the space of algorithm candidates via Bayesian optimization. Through an ablation analysis, we then test whether the resulting design choices are necessary or could be replaced by simpler, more naive ones. We find that a relatively simple configuration, in some ways simpler than established methods, performs very well as long as a few critical configuration parameters are set to the right values.
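The core idea, searching over a configurable space of algorithm components to find a good optimizer design, can be sketched as follows. This is a minimal illustrative example, not the paper's actual framework: the component names and the benchmark scoring function are hypothetical, and plain random search stands in for the Bayesian optimization the authors use.

```python
import random

# Hypothetical design space for a multifidelity HPO algorithm candidate.
# Each key is an algorithmic component; the values are available choices.
# (Illustrative names only, not the paper's actual configuration space.)
DESIGN_SPACE = {
    "sampling": ["random", "model_based"],
    "surrogate": ["none", "random_forest", "gp"],
    "multifidelity": ["none", "successive_halving", "hyperband_style"],
    "eta": [2, 3, 4],  # aggressiveness of the fidelity schedule
}

def sample_candidate(rng):
    """Draw one algorithm configuration from the design space."""
    return {key: rng.choice(choices) for key, choices in DESIGN_SPACE.items()}

def benchmark_score(candidate):
    """Toy stand-in for evaluating a candidate optimizer on a benchmark
    suite; real use would measure anytime performance across many HPO tasks."""
    score = 0.0
    score += 1.0 if candidate["sampling"] == "model_based" else 0.0
    score += 0.5 if candidate["multifidelity"] != "none" else 0.0
    score += 0.2 if candidate["eta"] == 3 else 0.0
    return score

def search(n_iter=200, seed=0):
    """Random search over algorithm candidates. The paper instead uses
    Bayesian optimization over this space, which is more sample-efficient."""
    rng = random.Random(seed)
    best, best_score = None, float("-inf")
    for _ in range(n_iter):
        candidate = sample_candidate(rng)
        s = benchmark_score(candidate)
        if s > best_score:
            best, best_score = candidate, s
    return best, best_score

best, score = search()
print(best, score)
```

An ablation analysis, as described in the abstract, would then replace each component of the best-found configuration with a simpler default one at a time and re-score it, revealing which choices actually matter.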

