
Modulated Bayesian Optimization using Latent Gaussian Process Models

by Erik Bodin, et al.

We present an approach to Bayesian Optimization that allows for robust search strategies over a large class of challenging functions. Our method is motivated by the observation that the trends useful to exploit in search of the optimum are typically only a subset of the characteristics of the true objective function. At the core of our approach is a Latent Gaussian Process Regression model that modulates the input domain with an orthogonal latent space. Using this latent space, we can encapsulate local information about each observed data point and use it to guide the search. We show experimentally that our method significantly improves performance on challenging benchmarks.
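To make the core idea concrete, here is a minimal NumPy sketch of modulating a GP's input domain with an extra, orthogonal latent dimension. This is an illustration under assumptions, not the paper's model: the per-observation latent coordinates `W` are set by hand here (the paper infers them), and the kernel, lengthscale, and toy objective are all invented for the example. A corrupted observation is pushed away along the latent axis so that predictions on the "clean" latent slice are barely influenced by it.

```python
import numpy as np

def rbf(A, B, ls=0.3):
    # Squared-exponential kernel over the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

# Toy 1-D objective with one corrupted observation.
X = np.linspace(0.0, 1.0, 8)[:, None]
y = np.sin(2.0 * np.pi * X[:, 0])
y[3] += 2.0  # corrupt a single data point

# Hypothetical per-observation latent coordinates (one extra dimension).
# In the paper these would be inferred; here we place the corrupted point
# far away in the latent dimension by hand, purely for illustration.
W = np.zeros((8, 1))
W[3, 0] = 3.0

# Modulated input: concatenate observed input x with latent coordinate w.
Z = np.hstack([X, W])
K = rbf(Z, Z) + 1e-6 * np.eye(8)  # small jitter for numerical stability

# Predict on the w = 0 slice of the latent space.
Xs = np.linspace(0.0, 1.0, 50)[:, None]
Zs = np.hstack([Xs, np.zeros((50, 1))])
mu = rbf(Zs, Z) @ np.linalg.solve(K, y)  # GP posterior mean
```

Because the corrupted point sits far away along the latent axis, its kernel similarity to the `w = 0` test slice is negligible, so the posterior mean `mu` tracks the smooth trend rather than the spike; this is the sense in which the latent space can absorb local irregularities while the search exploits the remaining trend.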


