Nonlinear Meta-Learning Can Guarantee Faster Rates

07/20/2023
by Dimitri Meunier, et al.

Many recent theoretical works on meta-learning aim to achieve guarantees for leveraging similar representational structures from related tasks to simplify a target task. Importantly, the main aim of theoretical work on the subject is to understand the extent to which convergence rates – in learning a common representation – may scale with the number N of tasks (as well as the number of samples per task). First steps in this setting demonstrate this property when both the shared representation amongst tasks and the task-specific regression functions are linear. This linear setting readily reveals the benefits of aggregating tasks, e.g., via averaging arguments. In practice, however, the representation is often highly nonlinear, introducing nontrivial biases in each task that cannot easily be averaged out as in the linear case. In the present work, we derive theoretical guarantees for meta-learning with nonlinear representations. In particular, assuming the shared nonlinearity maps to an infinite-dimensional RKHS, we show that additional biases can be mitigated with careful regularization that leverages the smoothness of task-specific regression functions.
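To make the tension between averaging and regularization bias concrete, below is a minimal numpy sketch, not the authors' algorithm: each of N tasks fits kernel ridge regression to a scaled copy of a shared nonlinear function, and the N fits are averaged. The data model y = a_t·g(x) + noise, the choice g(x) = sin(3x), the RBF bandwidth, and the 1/N regularization schedule are all illustrative assumptions. Averaging removes the noise and the task-to-task variation in a_t, but the shrinkage bias from a fixed ridge parameter persists at every N; shrinking the ridge parameter as N grows is one hedged way to see how careful regularization can restore rates that improve with the number of tasks.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(A, B, gamma=10.0):
    """Gaussian (RBF) kernel matrix between the rows of A and the rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def g(x):
    """Stand-in for the shared nonlinear representation (unknown in practice)."""
    return np.sin(3 * x)

def averaged_fit_rmse(N, n, lam):
    """RMSE of the average of N per-task kernel ridge fits of y = a_t*g(x) + noise."""
    x_test = np.linspace(-1, 1, 200)[:, None]
    f_bar = np.zeros(len(x_test))
    for _ in range(N):
        a_t = 1.0 + 0.3 * rng.standard_normal()    # task coefficient, mean 1
        X = rng.uniform(-1, 1, size=(n, 1))
        y = a_t * g(X[:, 0]) + 0.1 * rng.standard_normal(n)
        # Per-task kernel ridge regression solved in closed form.
        alpha = np.linalg.solve(rbf(X, X) + n * lam * np.eye(n), y)
        f_bar += rbf(x_test, X) @ alpha            # per-task prediction on a test grid
    f_bar /= N                                     # averaging removes variance, not bias
    return float(np.sqrt(np.mean((f_bar - g(x_test[:, 0])) ** 2)))

n = 30  # samples per task
for N in (5, 50, 500):
    fixed = averaged_fit_rmse(N, n, lam=1e-1)      # fixed ridge: error plateaus at a bias floor
    tuned = averaged_fit_rmse(N, n, lam=1e-1 / N)  # 1/N schedule: illustrative, not the paper's
    print(f"N={N:4d}  fixed-lam RMSE={fixed:.3f}  shrinking-lam RMSE={tuned:.3f}")
```

Running the sketch, the fixed-regularization error stops improving once task averaging has removed the noise, while the shrinking schedule keeps improving with N; this mirrors, in a toy form, the abstract's point that nonlinear biases do not average out on their own.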


Related research

03/30/2021 · Conditional Meta-Learning of Linear Representations
Standard meta-learning for representation learning aims to find a common...

10/19/2020 · Meta-learning the Learning Trends Shared Across Tasks
Meta-learning stands for 'learning to learn' such that generalization to...

05/18/2021 · Sample Efficient Linear Meta-Learning by Alternating Minimization
Meta-learning synthesizes and leverages the knowledge from a given set o...

01/17/2023 · Convergence of First-Order Algorithms for Meta-Learning with Moreau Envelopes
In this work, we consider the problem of minimizing the sum of Moreau en...

02/14/2022 · Trace norm regularization for multi-task learning with scarce data
Multi-task learning leverages structural similarities between multiple t...

03/08/2023 · Provable Pathways: Learning Multiple Tasks over Multiple Paths
Constructing useful representations across a large number of tasks is a ...

02/14/2021 · Sample Efficient Subspace-based Representations for Nonlinear Meta-Learning
Constructing good representations is critical for learning complex tasks...
