Unbounded Bayesian Optimization via Regularization

08/14/2015
by Bobak Shahriari, et al.

Bayesian optimization has recently emerged as a popular and efficient tool for global optimization and hyperparameter tuning. Current established Bayesian optimization practice requires a user-defined bounding box that is assumed to contain the optimizer. However, when little is known about the probed objective function, it can be difficult to prescribe such bounds. In this work we modify the standard Bayesian optimization framework in a principled way to allow automatic resizing of the search space. We introduce two alternative methods and compare them on two common synthetic benchmark test functions, as well as on the tasks of tuning the stochastic gradient descent optimizer of a multi-layer perceptron and of a convolutional neural network on MNIST.
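To make the idea concrete, here is a minimal, self-contained sketch of how a regularizer can replace a hard bounding box in a Bayesian optimization loop. This is an illustration, not the paper's exact method: it uses a Gaussian process with a quadratic prior mean centered at a user-chosen point `x_center`, so the expected-improvement acquisition is naturally penalized far from the data and can be maximized over an effectively unbounded domain. All function names, the toy objective, and the regularizer strength are assumptions for this example.

```python
import numpy as np
from scipy.stats import norm

def rbf_kernel(A, B, ls=1.0, var=1.0):
    # Squared-exponential kernel between 1-D point sets A and B.
    d2 = (A[:, None] - B[None, :]) ** 2
    return var * np.exp(-0.5 * d2 / ls**2)

def gp_posterior(X, y, Xs, prior_mean, noise=1e-6):
    # GP regression with a non-constant prior mean m(x); the quadratic
    # mean acts as the regularizer that discourages distant queries.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y - prior_mean(X)))
    mu = prior_mean(Xs) + Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = 1.0 - np.sum(v**2, axis=0)  # prior variance is 1 on the diagonal
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):
    z = (mu - best) / sigma
    return (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)

def regularized_bo(f, x_init, n_iters=10, reg=0.05, x_center=0.0):
    # Maximize f without a bounding box: the acquisition is evaluated on a
    # very wide grid, and the quadratic prior mean keeps queries near the
    # data unless there is real promise further out.
    prior_mean = lambda x: -reg * (x - x_center) ** 2
    grid = np.linspace(-20.0, 20.0, 401)  # wide stand-in for "unbounded"
    X = np.array(x_init, dtype=float)
    y = f(X)
    for _ in range(n_iters):
        mu, sigma = gp_posterior(X, y, grid, prior_mean)
        x_next = grid[np.argmax(expected_improvement(mu, sigma, y.max()))]
        X = np.append(X, x_next)
        y = np.append(y, f(np.array([x_next])))
    return X, y

# Toy objective with its maximum at x = 2, outside any "safe" tiny box
# around the initial design at {-1, 0, 1}.
f = lambda x: -((x - 2.0) ** 2) / 10.0
X, y = regularized_bo(f, x_init=[-1.0, 0.0, 1.0])
```

The key design choice is that nothing clips the candidate set to a user-supplied box: the penalty `-reg * (x - x_center)**2` in the prior mean makes far-away points unattractive to expected improvement, while still letting the loop move well outside the initial design when the data support it.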

