Local Latent Space Bayesian Optimization over Structured Inputs

01/28/2022
by Natalie Maus, et al.

Bayesian optimization over the latent spaces of deep autoencoder models (DAEs) has recently emerged as a promising new approach for optimizing challenging black-box functions over structured, discrete, hard-to-enumerate search spaces (e.g., molecules). Here the DAE dramatically simplifies the search space by mapping inputs into a continuous latent space where familiar Bayesian optimization tools can be more readily applied. Despite this simplification, the latent space typically remains high-dimensional. Thus, even with a well-suited latent space, these approaches do not necessarily provide a complete solution, but may rather shift the structured optimization problem to a high-dimensional one. In this paper, we propose LOL-BO, which adapts the notion of trust regions explored in recent work on high-dimensional Bayesian optimization to the structured setting. By reformulating the encoder to function as both an encoder for the DAE globally and as a deep kernel for the surrogate model within a trust region, we better align the notion of local optimization in the latent space with local optimization in the input space. LOL-BO achieves as much as 20 times improvement over state-of-the-art latent space Bayesian optimization methods across six real-world benchmarks, demonstrating that improvement in optimization strategies is as important as developing better DAE models.
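To make the trust-region idea referenced in the abstract concrete, the sketch below illustrates generic local Bayesian optimization in a fixed DAE latent space. It is not the paper's LOL-BO implementation: it uses a standard GPyTorch GP surrogate rather than the paper's shared encoder/deep-kernel formulation, and the decode and objective functions are toy stand-ins for a trained decoder and a black-box structured objective.

# Minimal sketch of TuRBO-style local BO over a latent space (not the authors' code).
# `decode` and `objective` are toy placeholders for a trained DAE decoder and a
# black-box objective over structured inputs.
import torch
import gpytorch

torch.manual_seed(0)
LATENT_DIM = 8

def decode(z):                       # stand-in for a trained DAE decoder
    return z                         # identity: "structures" are just latent points here

def objective(x):                    # stand-in black-box objective (to maximize)
    return -(x ** 2).sum(dim=-1)

class SurrogateGP(gpytorch.models.ExactGP):
    def __init__(self, train_z, train_y, likelihood):
        super().__init__(train_z, train_y, likelihood)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(
            gpytorch.kernels.MaternKernel(nu=2.5, ard_num_dims=LATENT_DIM)
        )

    def forward(self, z):
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(z), self.covar_module(z)
        )

def fit_gp(train_z, train_y):
    # Fit GP hyperparameters by maximizing the exact marginal log likelihood.
    likelihood = gpytorch.likelihoods.GaussianLikelihood()
    model = SurrogateGP(train_z, train_y, likelihood)
    model.train(); likelihood.train()
    opt = torch.optim.Adam(model.parameters(), lr=0.1)
    mll = gpytorch.mlls.ExactMarginalLogLikelihood(likelihood, model)
    for _ in range(50):
        opt.zero_grad()
        loss = -mll(model(train_z), train_y)
        loss.backward(); opt.step()
    model.eval(); likelihood.eval()
    return model, likelihood

# Initial design in the latent box [-3, 3]^d.
Z = 6 * torch.rand(20, LATENT_DIM) - 3
Y = objective(decode(Z))

length = 0.8                          # trust-region edge length
successes, failures = 0, 0

for step in range(30):
    model, likelihood = fit_gp(Z, Y)

    # Center the trust region on the best latent point found so far.
    center = Z[Y.argmax()]
    lb, ub = center - length / 2, center + length / 2

    # Thompson-sample candidates restricted to the trust region.
    cand = lb + (ub - lb) * torch.rand(512, LATENT_DIM)
    with torch.no_grad(), gpytorch.settings.fast_pred_var():
        samples = likelihood(model(cand)).rsample()
    z_next = cand[samples.argmax()].unsqueeze(0)

    # Decode and evaluate the black-box objective on the (structured) input.
    y_next = objective(decode(z_next))
    Z, Y = torch.cat([Z, z_next]), torch.cat([Y, y_next])

    # Expand the trust region after repeated successes, shrink after failures.
    if y_next > Y[:-1].max():
        successes, failures = successes + 1, 0
    else:
        successes, failures = 0, failures + 1
    if successes >= 3:
        length, successes = min(2 * length, 3.2), 0
    elif failures >= 5:
        length, failures = length / 2, 0

print("best value:", Y.max().item())

In LOL-BO itself, per the abstract, the DAE encoder doubles as the deep kernel of the surrogate model within the trust region, coupling surrogate fitting with the shape of the latent space; the loop above shows only the generic trust-region bookkeeping.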

Related research

Combining Latent Space and Structured Kernels for Bayesian Optimization over Combinatorial Spaces (11/01/2021)
We consider the problem of optimizing combinatorial spaces (e.g., sequen...

Good practices for Bayesian Optimization of high dimensional structured spaces (12/31/2020)
The increasing availability of structured but high dimensional data has ...

Learning Representation for Bayesian Optimization with Collision-free Regularization (03/16/2022)
Bayesian optimization has been challenged by datasets with large-scale, ...

Modulated Bayesian Optimization using Latent Gaussian Process Models (06/26/2019)
We present an approach to Bayesian Optimization that allows for robust s...

GLSO: Grammar-guided Latent Space Optimization for Sample-efficient Robot Design Automation (09/23/2022)
Robots have been used in all sorts of automation, and yet the design of ...

Generative Melody Composition with Human-in-the-Loop Bayesian Optimization (10/07/2020)
Deep generative models allow even novice composers to generate various m...

Learning Discrete Structured Variational Auto-Encoder using Natural Evolution Strategies (05/03/2022)
Discrete variational auto-encoders (VAEs) are able to represent semantic...
