JUMBO: Scalable Multi-task Bayesian Optimization using Offline Data

06/02/2021
by   Kourosh Hakhamaneshi, et al.
The goal of Multi-task Bayesian Optimization (MBO) is to minimize the number of queries required to accurately optimize a target black-box function, given access to offline evaluations of other auxiliary functions. When offline datasets are large, the scalability of prior approaches comes at the expense of expressivity and inference quality. We propose JUMBO, an MBO algorithm that sidesteps these limitations by querying additional data based on a combination of acquisition signals derived from training two Gaussian Processes (GPs): a cold-GP operating directly in the input domain and a warm-GP that operates in the feature space of a deep neural network pretrained using the offline data. Such a decomposition can dynamically control the reliability of information derived from the online and offline data, and the use of pretrained neural networks permits scalability to large offline datasets. Theoretically, we derive regret bounds for JUMBO and show that it achieves no-regret under conditions analogous to GP-UCB (Srinivas et al., 2010). Empirically, we demonstrate significant performance improvements over existing approaches on two real-world optimization problems: hyper-parameter optimization and automated circuit design.
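The two-GP decomposition described above can be sketched on a 1-D toy problem. This is an illustrative reconstruction, not the authors' code: the names (`warm_gp`, `cold_gp`), the choice of a one-hidden-layer network as the pretrained feature extractor, and the simple rule of taking the pointwise minimum of the two UCB acquisition scores are all our assumptions; JUMBO's actual acquisition combination is more involved.

```python
# Hedged sketch of a cold-GP / warm-GP decomposition (illustrative only).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Offline data from a related auxiliary function.
aux = lambda x: np.sin(3 * x) + 0.3 * x
X_off = rng.uniform(-2, 2, (200, 1))
y_off = aux(X_off).ravel() + 0.05 * rng.standard_normal(200)

# Pretrain a small network on the offline data; its hidden-layer
# activations supply the feature space for the warm-GP.
net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000,
                   random_state=0).fit(X_off, y_off)
features = lambda X: np.maximum(0, X @ net.coefs_[0] + net.intercepts_[0])

# Target black-box function: a shifted variant of the auxiliary one,
# observed only through a handful of online queries.
target = lambda x: np.sin(3 * x + 0.4) + 0.3 * x
X_on = rng.uniform(-2, 2, (5, 1))
y_on = target(X_on).ravel()

# Warm-GP in the pretrained feature space; cold-GP in the raw input domain.
warm_gp = GaussianProcessRegressor(normalize_y=True).fit(features(X_on), y_on)
cold_gp = GaussianProcessRegressor(normalize_y=True).fit(X_on, y_on)

# UCB-style acquisition from each GP (maximization convention).
beta = 2.0
def ucb(gp, Z):
    mu, sd = gp.predict(Z, return_std=True)
    return mu + beta * sd

# Combine the two acquisition signals (here: pointwise minimum, i.e.
# trust the offline features only where the cold-GP also agrees).
grid = np.linspace(-2, 2, 401).reshape(-1, 1)
score = np.minimum(ucb(warm_gp, features(grid)), ucb(cold_gp, grid))
x_next = grid[np.argmax(score)]
print("next query:", x_next)
```

The warm-GP generalizes from few online points because the pretrained features already encode structure shared with the auxiliary task, while the cold-GP guards against the offline data being misleading for the target.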

