PAC-Bayesian Meta-Learning: From Theory to Practice

11/14/2022
by Jonas Rothfuss, et al.

Meta-learning aims to accelerate learning on new tasks by acquiring useful inductive biases from related data sources. In practice, the number of tasks available for meta-learning is often small, yet most existing approaches rely on an abundance of meta-training tasks, making them prone to overfitting. How to regularize the meta-learner to ensure generalization to unseen tasks is a central question in the literature. We provide a theoretical analysis using the PAC-Bayesian framework and derive the first bound for meta-learners with unbounded loss functions. Crucially, our bounds allow us to derive the PAC-optimal hyper-posterior (PACOH), the closed-form solution of the PAC-Bayesian meta-learning problem. This avoids the reliance on nested optimization and gives rise to an optimization problem amenable to standard variational methods that scale well. Our experiments show that, when instantiating PACOH with Gaussian processes and Bayesian neural networks as base learners, the resulting methods are more scalable and yield state-of-the-art performance, both in terms of predictive accuracy and the quality of uncertainty estimates. Finally, thanks to the principled treatment of uncertainty, our meta-learners can also be successfully employed for sequential decision problems.
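As a pointer to the key object mentioned above: the PAC-optimal hyper-posterior has the Gibbs form familiar from the PACOH line of work. The following is a minimal sketch in which the temperature \tau stands in for the bound's confidence and sample-size constants (see the paper for the exact values):

\[
Q^*(P) \;\propto\; \mathcal{P}(P)\,\exp\!\Big(\tau \sum_{i=1}^{n} \tfrac{1}{\beta_i}\,\ln Z_{\beta_i}(S_i, P)\Big),
\qquad
Z_{\beta}(S, P) \;=\; \mathbb{E}_{h \sim P}\!\big[\,e^{-\beta\,\hat{\mathcal{L}}(h, S)}\,\big],
\]

where \mathcal{P} is the hyper-prior over base-learner priors P, S_1, ..., S_n are the meta-training datasets, \hat{\mathcal{L}} is the empirical loss, and Z_\beta is the generalized marginal likelihood of the base learner. For a GP base learner with Gaussian likelihood and \beta_i set to the task's sample size, \ln Z_{\beta_i}(S_i, P) reduces to the GP log marginal likelihood; approximating Q^* directly (e.g., with a variational family or particle methods) is what removes the nested optimization mentioned in the abstract.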


