BOAH: A Tool Suite for Multi-Fidelity Bayesian Optimization & Analysis of Hyperparameters

08/16/2019
by   Marius Lindauer, et al.

Hyperparameter optimization and neural architecture search can become prohibitively expensive for regular black-box Bayesian optimization because the training and evaluation of a single model can easily take several hours. To overcome this, we introduce a comprehensive tool suite for effective multi-fidelity Bayesian optimization and the analysis of its runs. The suite, written in Python, provides a simple way to specify complex design spaces, a robust and efficient combination of Bayesian optimization and HyperBand, and a comprehensive analysis of the optimization process and its outcomes.
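The key to the suite's efficiency is the HyperBand component, which allocates small budgets (e.g. few training epochs) to many candidate configurations and repeatedly promotes only the best fraction to larger budgets. As a minimal, self-contained sketch of that core mechanism (this is illustrative pure Python, not the suite's actual API; the function name and parameters are hypothetical):

```python
def successive_halving_schedule(n_configs, min_budget, max_budget, eta=3):
    """Sketch of the successive-halving schedule underlying HyperBand:
    start many configurations on a small budget, keep roughly the top
    1/eta fraction each round, and multiply the budget by eta."""
    schedule = []
    n, budget = n_configs, min_budget
    while budget <= max_budget and n >= 1:
        schedule.append((n, budget))  # (configs evaluated, budget per config)
        n = max(1, n // eta)          # promote only the top 1/eta fraction
        budget *= eta                 # give survivors eta times the budget
    return schedule

# e.g. 27 configurations, budgets from 1 to 27 epochs, eta = 3
print(successive_halving_schedule(27, 1, 27))
# [(27, 1), (9, 3), (3, 9), (1, 27)]
```

Instead of sampling the initial configurations uniformly at random as plain HyperBand does, the suite's Bayesian-optimization component proposes them from a model fitted to previous evaluations, so later rounds concentrate on promising regions of the design space.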


