Meta-Surrogate Benchmarking for Hyperparameter Optimization

05/30/2019
by   Aaron Klein, et al.

Despite recent progress in hyperparameter optimization (HPO), the available benchmarks that resemble real-world scenarios consist of only a few, very large problem instances that are expensive to solve. This prevents researchers and practitioners not only from systematically running the large-scale comparisons needed to draw statistically significant conclusions, but also from reproducing previously conducted experiments. This work proposes a method to alleviate these issues by means of a meta-surrogate model for HPO tasks, trained on offline-generated data. The model combines a probabilistic encoder with a multi-task model so that it can generate inexpensive and realistic tasks from the class of problems of interest. We demonstrate that benchmarking HPO methods on samples from the generative model allows us to draw more coherent and statistically significant conclusions, and to reach them orders of magnitude faster than with the original tasks. We provide evidence of our findings for various HPO methods on a wide class of problems.
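To illustrate the core idea, the following is a minimal, hypothetical sketch of benchmarking on generated tasks: a latent task vector z is sampled from the surrogate's task space, decoded into a cheap objective function, and an HPO method is then evaluated on many such synthetic tasks. The names (`surrogate_loss`, `sample_task`) and the toy quadratic objective are illustrative assumptions, not the paper's actual model, which is a learned multi-task neural surrogate.

```python
import numpy as np

rng = np.random.default_rng(0)

def surrogate_loss(z, x):
    # Hypothetical stand-in for the trained meta-surrogate: maps a latent
    # task vector z and a hyperparameter configuration x to a loss.
    # A quadratic whose optimum depends on z keeps the sketch runnable.
    optimum = np.tanh(z[: x.shape[0]])  # task-dependent optimum in [-1, 1]
    return float(np.sum((x - optimum) ** 2))

def sample_task(latent_dim=2):
    """Draw a new synthetic HPO task by sampling the latent task space."""
    z = rng.standard_normal(latent_dim)
    return lambda x: surrogate_loss(z, x)

def random_search(objective, n_evals=50, dim=2):
    # A trivial HPO method to benchmark; each evaluation is inexpensive
    # because the objective is a surrogate, not a real training run.
    best = np.inf
    for _ in range(n_evals):
        x = rng.uniform(-1.0, 1.0, size=dim)
        best = min(best, objective(x))
    return best

# Benchmark across many cheap generated tasks instead of a few expensive ones.
tasks = [sample_task() for _ in range(5)]
results = [random_search(t) for t in tasks]
print(results)
```

Because each generated task is cheap to evaluate, the same HPO method can be run on hundreds of sampled tasks, which is what enables the statistically stronger comparisons the abstract describes.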


Related research

- 03/30/2017: Efficient Benchmarking of Algorithm Configuration Procedures via Model-Based Surrogates
- 06/08/2021: EXPObench: Benchmarking Surrogate-based Optimisation Algorithms on Expensive Black-box Functions
- 02/07/2021: Hyperparameter Optimization with Differentiable Metafeatures
- 10/15/2021: Improving Hyperparameter Optimization by Planning Ahead
- 10/04/2021: HYPPO: A Surrogate-Based Multi-Level Parallelism Tool for Hyperparameter Optimization
- 09/14/2021: HPOBench: A Collection of Reproducible Multi-Fidelity Benchmark Problems for HPO
- 07/11/2022: Start Small, Think Big: On Hyperparameter Optimization for Large-Scale Knowledge Graph Embeddings
