Bayes meets Bernstein at the Meta Level: an Analysis of Fast Rates in Meta-Learning with PAC-Bayes

02/23/2023
by Charles Riou et al.

Bernstein's condition is a key assumption that guarantees fast rates in machine learning. For example, the Gibbs algorithm with prior π has an excess risk in O(d_π/n), as opposed to the standard O(√(d_π/n)), where n denotes the number of observations and d_π is a complexity parameter which depends on the prior π. In this paper, we examine the Gibbs algorithm in the context of meta-learning, i.e., when learning the prior π from T tasks (with n observations each) generated by a meta-distribution. Our main result is that Bernstein's condition always holds at the meta level, regardless of its validity at the observation level. This implies that the additional cost of learning the Gibbs prior π, which will reduce the term d_π across tasks, is in O(1/T), instead of the expected O(1/√T). We further illustrate how this result improves on the standard rates in three different settings: discrete priors, Gaussian priors, and mixtures of Gaussians.
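To make the object of study concrete, here is a minimal sketch of the Gibbs algorithm in the discrete-prior setting mentioned in the abstract: given a prior π over M candidate predictors and their empirical risks on n observations, the Gibbs posterior reweights each candidate by exp(−β·n·risk), concentrating on low-risk candidates as n grows. The function name, the inverse temperature β, and the toy numbers are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def gibbs_posterior(prior, empirical_risks, n, beta=1.0):
    """Gibbs posterior over a discrete set of candidates.

    prior: array of prior weights pi over M candidates (sums to 1).
    empirical_risks: array of empirical risks, one per candidate.
    n: number of observations; beta: inverse temperature (assumed).
    """
    log_w = np.log(prior) - beta * n * np.asarray(empirical_risks)
    log_w -= log_w.max()          # subtract max for numerical stability
    w = np.exp(log_w)
    return w / w.sum()            # normalize to a probability vector

# Toy example: 3 candidate predictors, uniform prior, n = 50 observations.
prior = np.ones(3) / 3
risks = np.array([0.40, 0.10, 0.35])
post = gibbs_posterior(prior, risks, n=50)
# The posterior concentrates almost all mass on the lowest-risk candidate.
```

Meta-learning, in the sense of the abstract, would then tune the prior `prior` itself across T such tasks so that the per-task complexity term d_π shrinks.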


Related research:

- PAC-Bayes Bounds for Meta-learning with Data-Dependent Prior (02/07/2021): By leveraging experience from previous tasks, meta-learning algorithms c...
- A General framework for PAC-Bayes Bounds for Meta-Learning (06/11/2022): Meta learning automatically infers an inductive bias, that includes the ...
- PAC-BUS: Meta-Learning Bounds via PAC-Bayes and Uniform Stability (02/12/2021): We are motivated by the problem of providing strong generalization guara...
- On the Generalization Error of Meta Learning for the Gibbs Algorithm (04/27/2023): We analyze the generalization ability of joint-training meta learning al...
- Meta-Learning PAC-Bayes Priors in Model Averaging (12/24/2019): Nowadays model uncertainty has become one of the most important problems...
- Transfer Bayesian Meta-learning via Weighted Free Energy Minimization (06/20/2021): Meta-learning optimizes the hyperparameters of a training procedure, suc...
- How Important is the Train-Validation Split in Meta-Learning? (10/12/2020): Meta-learning aims to perform fast adaptation on a new task through lear...
