Meta-Learning for Symbolic Hyperparameter Defaults

06/10/2021
by Pieter Gijsbers, et al.

Hyperparameter optimization in machine learning (ML) deals with the problem of empirically learning an optimal algorithm configuration from data, usually formulated as a black-box optimization problem. In this work, we propose a zero-shot method to meta-learn symbolic default hyperparameter configurations that are expressed in terms of the properties of the dataset. This enables a much faster, but still data-dependent, configuration of the ML algorithm compared to standard hyperparameter optimization approaches. In the past, symbolic and static default values have usually been obtained as hand-crafted heuristics. We propose an approach to learning such symbolic configurations as formulas of dataset properties from a large set of prior evaluations on multiple datasets, by optimizing over a grammar of expressions using an evolutionary algorithm. We evaluate our method on surrogate empirical performance models as well as on real data across six ML algorithms on more than 100 datasets, and demonstrate that our method indeed finds viable symbolic defaults.
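To make the search procedure concrete, the sketch below shows one way such a symbolic-default search could look: candidate formulas over dataset meta-features (here just the number of instances n and features p) are drawn from a small expression grammar and evolved against a stand-in scorer. The grammar, the meta-features, and especially surrogate_score are illustrative assumptions for this sketch, not the paper's actual grammar or surrogate models.

```python
import math
import random

# Hypothetical dataset meta-features; the paper expresses defaults as
# formulas over such properties (e.g. number of instances and features).
META_FEATURES = ["n", "p"]

# A tiny, assumed expression grammar: constants, meta-features, operators.
UNARY_OPS = {
    "log": lambda a: math.log(max(a, 1e-12)),
    "sqrt": lambda a: math.sqrt(max(a, 0.0)),
}
BINARY_OPS = {
    "add": lambda a, b: a + b,
    "mul": lambda a, b: a * b,
    "div": lambda a, b: a / b if abs(b) > 1e-12 else 1.0,
}

def random_expr(depth=0, max_depth=3):
    """Sample a random expression tree over the grammar."""
    if depth >= max_depth or random.random() < 0.3:
        if random.random() < 0.5:
            return ("const", random.uniform(0.01, 10.0))
        return ("feat", random.choice(META_FEATURES))
    if random.random() < 0.4:
        return ("unary", random.choice(list(UNARY_OPS)),
                random_expr(depth + 1, max_depth))
    return ("binary", random.choice(list(BINARY_OPS)),
            random_expr(depth + 1, max_depth),
            random_expr(depth + 1, max_depth))

def evaluate(expr, feats):
    """Evaluate an expression tree on one dataset's meta-features."""
    kind = expr[0]
    if kind == "const":
        return expr[1]
    if kind == "feat":
        return feats[expr[1]]
    if kind == "unary":
        return UNARY_OPS[expr[1]](evaluate(expr[2], feats))
    return BINARY_OPS[expr[1]](evaluate(expr[2], feats),
                               evaluate(expr[3], feats))

def mutate(expr):
    """Replace a random subtree with a freshly sampled one."""
    if expr[0] in ("const", "feat") or random.random() < 0.3:
        return random_expr()
    if expr[0] == "unary":
        return ("unary", expr[1], mutate(expr[2]))
    if random.random() < 0.5:
        return ("binary", expr[1], mutate(expr[2]), expr[3])
    return ("binary", expr[1], expr[2], mutate(expr[3]))

def surrogate_score(hp_value, feats):
    """Stand-in for a surrogate empirical performance model (assumption:
    the real method queries surrogates fitted on prior evaluations).
    Here we pretend the best default is sqrt(p), as for a random
    forest's mtry, and reward candidates for being close to it."""
    return -abs(hp_value - math.sqrt(feats["p"]))

def fitness(expr, datasets):
    """Average surrogate score of a symbolic default across datasets."""
    return sum(surrogate_score(evaluate(expr, d), d)
               for d in datasets) / len(datasets)

# Toy "datasets", each described only by its meta-features.
datasets = [{"n": 1000, "p": 20}, {"n": 500, "p": 100}, {"n": 20000, "p": 8}]

# A simple (mu + lambda)-style evolutionary loop over the grammar.
random.seed(0)
population = [random_expr() for _ in range(50)]
for _ in range(100):
    population.sort(key=lambda e: fitness(e, datasets), reverse=True)
    parents = population[:10]
    population = parents + [mutate(random.choice(parents)) for _ in range(40)]

best = max(population, key=lambda e: fitness(e, datasets))
print("best symbolic default:", best)
print("fitness:", fitness(best, datasets))
```

On these toy meta-features the loop tends to converge toward expressions close to sqrt(p), which is what the stand-in scorer rewards; in the actual method, fitness would instead come from surrogate performance models fitted on a large set of prior evaluations across datasets.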

