HPOBench: A Collection of Reproducible Multi-Fidelity Benchmark Problems for HPO

09/14/2021
by   Katharina Eggensperger, et al.

To achieve peak predictive performance, hyperparameter optimization (HPO) is a crucial component of machine learning and its applications. Over the last years, the number of efficient algorithms and tools for HPO has grown substantially. At the same time, the community still lacks realistic, diverse, computationally cheap, and standardized benchmarks. This is especially true for multi-fidelity HPO methods. To close this gap, we propose HPOBench, which includes 7 existing and 5 new benchmark families, with in total more than 100 multi-fidelity benchmark problems. HPOBench allows running this extendable set of multi-fidelity HPO benchmarks in a reproducible way by isolating and packaging the individual benchmarks in containers. It also provides surrogate and tabular benchmarks for computationally affordable yet statistically sound evaluations. To demonstrate the broad compatibility of HPOBench and its usefulness, we conduct an exemplary large-scale study evaluating 6 well-known multi-fidelity HPO tools.
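To illustrate what a multi-fidelity benchmark problem of the kind described above looks like, here is a minimal self-contained sketch. It is not the actual HPOBench API; the class, method names, and the toy objective are hypothetical, chosen only to show the core idea: a benchmark exposes a configuration space, a fidelity space (e.g. training epochs), and an objective that can be queried cheaply at low fidelity, with the low-fidelity answer being a noisier proxy for the full-fidelity one.

```python
import random


class ToyMultiFidelityBenchmark:
    """Toy 1-D benchmark: loss = (x - 0.3)^2 plus noise whose scale
    shrinks as the fidelity (here, number of epochs) grows."""

    def __init__(self, seed=0):
        self.rng = random.Random(seed)

    def get_configuration_space(self):
        # Single hyperparameter x in [0, 1].
        return {"x": (0.0, 1.0)}

    def get_fidelity_space(self):
        # Fidelity: epochs in [1, 100]; 100 is the full fidelity.
        return {"epochs": (1, 100)}

    def objective_function(self, config, fidelity):
        # Cheap evaluations (few epochs) are noisy proxies for the
        # expensive full-fidelity evaluation.
        noise = self.rng.gauss(0.0, 1.0 / fidelity["epochs"])
        loss = (config["x"] - 0.3) ** 2 + noise
        return {"function_value": loss, "cost": fidelity["epochs"]}


# Successive-halving-style usage: evaluate many configurations at low
# fidelity, keep the best half, and re-evaluate survivors at higher
# fidelity until one configuration remains.
bench = ToyMultiFidelityBenchmark(seed=42)
configs = [{"x": i / 9} for i in range(10)]
for epochs in (1, 10, 100):
    scored = [(bench.objective_function(c, {"epochs": epochs})["function_value"], c)
              for c in configs]
    scored.sort(key=lambda t: t[0])
    configs = [c for _, c in scored[: max(1, len(configs) // 2)]]
print(configs[0])
```

A multi-fidelity HPO tool interacts with such an interface by choosing both the configuration and the fidelity at which to evaluate it, trading evaluation cost against the reliability of the observed loss.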


Related research

- Is One Epoch All You Need For Multi-Fidelity Hyperparameter Optimization? (07/28/2023)
- A Principled Method for the Creation of Synthetic Multi-fidelity Data Sets (08/11/2022)
- Multi-Stage Multi-Fidelity Gaussian Process Modeling, with Application to Heavy-Ion Collisions (09/27/2022)
- Python Wrapper for Simulating Multi-Fidelity Optimization on HPO Benchmarks without Any Wait (05/27/2023)
- YAHPO Gym – Design Criteria and a new Multifidelity Benchmark for Hyperparameter Optimization (09/08/2021)
- Efficient Benchmarking of Algorithm Configuration Procedures via Model-Based Surrogates (03/30/2017)
- Meta-Surrogate Benchmarking for Hyperparameter Optimization (05/30/2019)
