Learning Multiple Defaults for Machine Learning Algorithms

11/23/2018
by   Florian Pfisterer, et al.

The performance of modern machine learning methods depends heavily on their hyperparameter configurations. One simple way of selecting a configuration is to use default settings, often proposed along with the publication and implementation of a new algorithm. Those default values are usually chosen in an ad-hoc manner to work well enough on a wide variety of datasets. To address this problem, different automatic hyperparameter configuration algorithms have been proposed, which select an optimal configuration per dataset. This principled approach usually improves performance but adds algorithmic complexity and computational cost to the training procedure. As an alternative, we propose learning a set of complementary default values from a large database of prior empirical results. Selecting an appropriate configuration for a new dataset then requires only a simple, efficient, and embarrassingly parallel search over this set. We demonstrate the effectiveness and efficiency of the proposed approach in comparison to random search and Bayesian optimization.
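The selection step the abstract describes can be sketched in a few lines: given a fixed, pre-learned set of complementary defaults, each candidate is scored independently on the new dataset and the best one is kept. The configurations and the `score` function below are hypothetical stand-ins (the paper does not specify them here); a real implementation would use cross-validated performance of the learner under each configuration.

```python
# Sketch of choosing among a learned set of complementary defaults.
# DEFAULTS and score() are illustrative placeholders, not the paper's actual set.
from concurrent.futures import ThreadPoolExecutor

# Hypothetical learned default set, e.g. for a gradient-boosting learner.
DEFAULTS = [
    {"learning_rate": 0.10, "max_depth": 3, "n_estimators": 100},
    {"learning_rate": 0.05, "max_depth": 6, "n_estimators": 300},
    {"learning_rate": 0.01, "max_depth": 9, "n_estimators": 1000},
]

def score(config):
    """Stand-in for cross-validated performance of `config` on the new dataset."""
    # Toy surrogate objective; replace with an actual validation metric.
    return 1.0 - abs(config["learning_rate"] - 0.05) \
               - 0.01 * abs(config["max_depth"] - 6)

def select_default(configs):
    # Each candidate is evaluated independently of the others, so the
    # search is embarrassingly parallel: one worker per configuration.
    with ThreadPoolExecutor(max_workers=len(configs)) as pool:
        scores = list(pool.map(score, configs))
    best = max(range(len(configs)), key=scores.__getitem__)
    return configs[best], scores[best]

best_config, best_score = select_default(DEFAULTS)
```

Because the candidate set is small and fixed, this avoids the sequential model-building of Bayesian optimization entirely; the only cost is one model fit per default, all of which can run concurrently.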


Related research

- 06/10/2021 — Meta-Learning for Symbolic Hyperparameter Defaults: Hyperparameter optimization in machine learning (ML) deals with the prob...
- 12/26/2016 — Clustering Algorithms: A Comparative Approach: Many real-world systems can be studied in terms of pattern recognition t...
- 07/13/2018 — Tune: A Research Platform for Distributed Model Selection and Training: Modern machine learning algorithms are increasingly computationally dema...
- 02/20/2022 — Mining Robust Default Configurations for Resource-constrained AutoML: Automatic machine learning (AutoML) is a key enabler of the mass deploym...
- 07/11/2017 — Hot-Rodding the Browser Engine: Automatic Configuration of JavaScript Compilers: Modern software systems in many application areas offer to the user a mu...
- 11/29/2021 — Automated Benchmark-Driven Design and Explanation of Hyperparameter Optimizers: Automated hyperparameter optimization (HPO) has gained great popularity ...
- 05/20/2021 — DEHB: Evolutionary Hyperband for Scalable, Robust and Efficient Hyperparameter Optimization: Modern machine learning algorithms crucially rely on several design deci...
