Transfer Bayesian Meta-learning via Weighted Free Energy Minimization

06/20/2021
by Yunchuan Zhang, et al.

Meta-learning optimizes the hyperparameters of a training procedure, such as its initialization, kernel, or learning rate, based on data sampled from a number of auxiliary tasks. A key underlying assumption is that the auxiliary tasks, known as meta-training tasks, share the same generating distribution as the tasks to be encountered at deployment time, known as meta-test tasks. This may, however, not be the case when the test environment differs from the meta-training conditions. To address shifts in the task-generating distribution between meta-training and meta-testing phases, this paper introduces weighted free energy minimization (WFEM) for transfer meta-learning. We instantiate the proposed approach for non-parametric Bayesian regression and classification via Gaussian Processes (GPs). The method is validated on a toy sinusoidal regression problem, as well as on classification using the miniImagenet and CUB data sets, through comparison with standard meta-learning of GP priors as implemented by PACOH.
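The core idea of weighting meta-training tasks by their relevance to the deployment distribution can be illustrated with a minimal sketch. The code below is an assumption-laden toy, not the paper's actual WFEM objective: it uses scalar linear-regression "tasks" in place of GP priors, and a simple quadratic prior penalty standing in for the KL term of a free energy. It shows how reweighting the per-task fit terms shifts the learned hyperparameter toward tasks that resemble the meta-test conditions, with uniform weights recovering a standard (PACOH-style) unweighted objective.

```python
import numpy as np

def task_nll(theta, x, y):
    # Negative log-likelihood of a toy linear model y ~ theta * x (unit noise).
    resid = y - theta * x
    return 0.5 * np.sum(resid ** 2)

def weighted_free_energy(theta, tasks, weights, prior_mu=0.0, prior_var=1.0):
    # Weighted free energy (toy form): task-weighted data fit plus a
    # Gaussian prior penalty standing in for the KL regularizer.
    fit = sum(w * task_nll(theta, x, y) for w, (x, y) in zip(weights, tasks))
    penalty = 0.5 * (theta - prior_mu) ** 2 / prior_var
    return fit + penalty

# Two meta-training tasks with different ground-truth slopes (1.0 and 3.0);
# suppose the meta-test environment resembles the second task.
x = np.linspace(0.0, 1.0, 20)
tasks = [(x, 1.0 * x), (x, 3.0 * x)]

grid = np.linspace(-1.0, 5.0, 601)

def minimize(weights):
    values = [weighted_free_energy(t, tasks, weights) for t in grid]
    return grid[int(np.argmin(values))]

theta_uniform = minimize([0.5, 0.5])   # standard, unweighted meta-learning
theta_transfer = minimize([0.1, 0.9])  # weights favoring the test-like task

print(theta_uniform, theta_transfer)
```

With uniform weights the minimizer sits between the two task slopes; upweighting the task that matches the deployment distribution pulls it toward that task's slope.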


