Meta-Learning for Symbolic Hyperparameter Defaults

by Pieter Gijsbers, et al.

Hyperparameter optimization in machine learning (ML) addresses the problem of empirically learning an optimal algorithm configuration from data, usually formulated as a black-box optimization problem. In this work, we propose a zero-shot method to meta-learn symbolic default hyperparameter configurations that are expressed in terms of the properties of the dataset. This enables a much faster, but still data-dependent, configuration of the ML algorithm compared to standard hyperparameter optimization approaches. In the past, symbolic and static default values have usually been obtained as hand-crafted heuristics. We propose an approach to learning such symbolic configurations as formulas of dataset properties from a large set of prior evaluations on multiple datasets, by optimizing over a grammar of expressions using an evolutionary algorithm. We evaluate our method on surrogate empirical performance models as well as on real data across 6 ML algorithms on more than 100 datasets, and demonstrate that our method indeed finds viable symbolic defaults.
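To make the idea concrete, the following is a minimal, hypothetical sketch (not the authors' implementation) of the two ingredients the abstract describes: symbolic defaults as expression trees over dataset meta-features (here only the number of instances `n` and features `p`), and an evolutionary loop that searches the grammar for the expression whose value best matches the empirically best hyperparameter value across a collection of datasets. The grammar, fitness function, and evolutionary scheme are simplified stand-ins.

```python
import math
import random

# Simplified grammar of expressions over dataset meta-features.
# A real system would use richer meta-features and operators.
OPS = {
    "+": lambda a, b: a + b,
    "*": lambda a, b: a * b,
    "/": lambda a, b: a / b if b != 0 else float("inf"),
}
UNARY = {
    "sqrt": lambda a: math.sqrt(abs(a)),
    "log": lambda a: math.log(a) if a > 0 else 0.0,
}
TERMINALS = ["n", "p", 1.0, 0.5]  # n = #instances, p = #features

def evaluate(expr, meta):
    """Recursively evaluate an expression tree on dataset meta-features."""
    if isinstance(expr, (int, float)):
        return float(expr)
    if isinstance(expr, str):
        return float(meta[expr])
    op, *args = expr
    if op in UNARY:
        return UNARY[op](evaluate(args[0], meta))
    return OPS[op](evaluate(args[0], meta), evaluate(args[1], meta))

def random_expr(depth=2):
    """Sample a random expression tree from the grammar."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    if random.random() < 0.3:
        return (random.choice(list(UNARY)), random_expr(depth - 1))
    return (random.choice(list(OPS)),
            random_expr(depth - 1), random_expr(depth - 1))

def fitness(expr, tasks):
    """Mean squared error between the symbolic default and the
    empirically best hyperparameter value on each (meta, best) task."""
    err = 0.0
    for meta, best in tasks:
        v = evaluate(expr, meta)
        if not math.isfinite(v):
            return float("inf")
        err += (v - best) ** 2
    return err / len(tasks)

def evolve(tasks, pop_size=50, generations=30, seed=0):
    """Toy elitist loop: keep the fittest expressions, refill the
    population with fresh random samples each generation."""
    random.seed(seed)
    pop = [random_expr() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda e: fitness(e, tasks))
        survivors = pop[: pop_size // 5]
        pop = survivors + [random_expr() for _ in range(pop_size - len(survivors))]
    return min(pop, key=lambda e: fitness(e, tasks))
```

For instance, on tasks where the best hyperparameter value happens to equal `sqrt(p)` (reminiscent of the classic `mtry` heuristic for random forests), the search should recover an expression close to `("sqrt", "p")`. In the paper, fitness is instead derived from predicted performance under surrogate models, not from matching known values.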


Related papers

- Learning Multiple Defaults for Machine Learning Algorithms
- Mining Robust Default Configurations for Resource-constrained AutoML
- Better call Surrogates: A hybrid Evolutionary Algorithm for Hyperparameter optimization
- Binary and Multinomial Classification through Evolutionary Symbolic Regression
- Hyperparameter Optimization for AST Differencing
- Practical and sample efficient zero-shot HPO
- HPO-B: A Large-Scale Reproducible Benchmark for Black-Box HPO based on OpenML
