PAC-BUS: Meta-Learning Bounds via PAC-Bayes and Uniform Stability

02/12/2021
by Alec Farid, et al.

We are motivated by the problem of providing strong generalization guarantees in the context of meta-learning. Existing generalization bounds are either challenging to evaluate or provide vacuous guarantees in even relatively simple settings. We derive a probably approximately correct (PAC) bound for gradient-based meta-learning using two different generalization frameworks in order to deal with the qualitatively different challenges of generalization at the "base" and "meta" levels. We employ bounds for uniformly stable algorithms at the base level and bounds from the PAC-Bayes framework at the meta level. The result is a PAC bound that is tighter when the base learner adapts quickly, which is precisely the goal of meta-learning. We show that our bound provides a tighter guarantee than other bounds on a toy non-convex problem on the unit sphere and on a text-based classification example. We also present a practical regularization scheme motivated by the bound for settings where the bound is loose, and demonstrate improved performance over baseline techniques.
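To make the PAC-Bayes ingredient of the abstract concrete, the sketch below evaluates a standard McAllester-style PAC-Bayes bound for a Gaussian prior and posterior over model parameters. This is a generic illustration of the meta-level bound component, not the paper's exact PAC-BUS bound; all numbers and dimensions are hypothetical, and the paper's actual bound additionally incorporates the base-level uniform-stability term.

```python
import numpy as np

def gaussian_kl(mu_q, sigma_q, mu_p, sigma_p):
    """KL divergence KL(N(mu_q, sigma_q^2 I) || N(mu_p, sigma_p^2 I))
    between diagonal Gaussians with scalar standard deviations."""
    d = mu_q.size
    return 0.5 * (
        d * (2.0 * np.log(sigma_p / sigma_q) - 1.0 + (sigma_q / sigma_p) ** 2)
        + np.sum((mu_q - mu_p) ** 2) / sigma_p ** 2
    )

def mcallester_bound(emp_risk, kl, m, delta=0.05):
    """Classic McAllester PAC-Bayes bound on expected risk: holds with
    probability >= 1 - delta over m i.i.d. samples (tasks, at the meta level)."""
    return emp_risk + np.sqrt((kl + np.log(2.0 * np.sqrt(m) / delta)) / (2.0 * m))

# Hypothetical values for illustration only.
mu_p = np.zeros(10)        # prior mean (the meta-learned quantity in such schemes)
mu_q = 0.1 * np.ones(10)   # posterior mean after task adaptation
kl = gaussian_kl(mu_q, 0.1, mu_p, 0.1)
bound = mcallester_bound(emp_risk=0.15, kl=kl, m=5000)
```

Note how the bound tightens as the posterior stays close to the prior (small KL), mirroring the abstract's point that fast base-level adaptation, which requires few high-divergence updates, yields a tighter overall guarantee.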


Related research

- PAC-Bayes Bounds for Meta-learning with Data-Dependent Prior (02/07/2021)
- Higher-Order Generalization Bounds: Learning Deep Probabilistic Programs via PAC-Bayes Objectives (03/30/2022)
- A General framework for PAC-Bayes Bounds for Meta-Learning (06/11/2022)
- PACOH: Bayes-Optimal Meta-Learning with PAC-Guarantees (02/13/2020)
- PAC-Bayesian Meta-Learning: From Theory to Practice (11/14/2022)
- Bayes meets Bernstein at the Meta Level: an Analysis of Fast Rates in Meta-Learning with PAC-Bayes (02/23/2023)
- PAC Prediction Sets for Meta-Learning (07/06/2022)
