On Provably Robust Meta-Bayesian Optimization

06/14/2022
by Zhongxiang Dai, et al.

Bayesian optimization (BO) has become popular for the sequential optimization of black-box functions. When BO is used to optimize a target function, we often have access to previous evaluations of potentially related functions. This raises the question of whether we can leverage these previous experiences to accelerate the current BO task through meta-learning (meta-BO), while ensuring robustness against potentially harmful dissimilar tasks that could sabotage the convergence of BO. This paper introduces two scalable and provably robust meta-BO algorithms: robust meta-Gaussian process-upper confidence bound (RM-GP-UCB) and RM-GP-Thompson sampling (RM-GP-TS). We prove that both algorithms are asymptotically no-regret even when some or all previous tasks are dissimilar to the current task, and show that RM-GP-UCB enjoys better theoretical robustness than RM-GP-TS. We also exploit the theoretical guarantees to optimize the weights assigned to individual previous tasks through regret minimization via online learning, which diminishes the impact of dissimilar tasks and hence further enhances robustness. Empirical evaluations show that (a) RM-GP-UCB performs effectively and consistently across various applications, and (b) RM-GP-TS, despite being less robust than RM-GP-UCB both in theory and in practice, performs competitively in some scenarios with fewer dissimilar tasks and is more computationally efficient.
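To make the two key ideas concrete, the sketch below shows (a) a UCB-style acquisition that augments the target GP's posterior with a weighted sum of previous tasks' posterior means, and (b) a multiplicative-weights (online-learning) update that downweights tasks whose predictions disagree with the target's observations. This is a minimal illustration under simplifying assumptions (1-D inputs, a fixed RBF kernel, exact GP inference), not the authors' exact RM-GP-UCB algorithm or its regret-derived weight schedule; all function names and hyperparameters here are hypothetical.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=0.2):
    # Squared-exponential kernel between two 1-D input arrays.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-4):
    # Standard zero-mean GP posterior mean and standard deviation.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_query, x_train)
    K_inv = np.linalg.inv(K)
    mean = K_s @ K_inv @ y_train
    var = 1.0 - np.sum((K_s @ K_inv) * K_s, axis=1)
    return mean, np.sqrt(np.maximum(var, 1e-12))

def meta_ucb(x_query, target_xy, prev_tasks, weights, beta=2.0):
    # UCB on the target GP, plus a weighted vote from previous tasks'
    # posterior means (a simplified stand-in for meta-BO information sharing).
    mu, sigma = gp_posterior(*target_xy, x_query)
    score = mu + beta * sigma
    for w, (xp, yp) in zip(weights, prev_tasks):
        mu_p, _ = gp_posterior(xp, yp, x_query)
        score += w * mu_p
    return score

def update_weights(weights, losses, eta=1.0):
    # Multiplicative-weights update from online learning: tasks with high
    # prediction loss on the target's observations are downweighted, which
    # diminishes the influence of dissimilar tasks over time.
    w = weights * np.exp(-eta * losses)
    return w / w.sum()
```

For example, starting from equal weights over two previous tasks, a task whose posterior mean poorly predicts the target's evaluations incurs a larger loss, so `update_weights` shrinks its weight relative to the similar task; repeated over BO iterations, harmful tasks are effectively ignored.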


