Towards Sample-efficient Overparameterized Meta-learning

01/16/2022
by Yue Sun, et al.

An overarching goal in machine learning is to build generalizable models from few samples. To this end, overparameterization has attracted immense interest as an explanation for the generalization ability of deep nets, even when the size of the dataset is smaller than that of the model. While the prior literature focuses on the classical supervised setting, this paper aims to demystify overparameterization for meta-learning. Here we consider a sequence of linear-regression tasks and ask: (1) Given the earlier tasks, what is the optimal linear representation of features for a new downstream task? and (2) How many samples do we need to build this representation? This work shows that, surprisingly, overparameterization arises as a natural answer to both of these fundamental meta-learning questions. Specifically, for (1), we first show that learning the optimal representation coincides with designing a task-aware regularization that promotes inductive bias. We leverage this inductive bias to explain how the downstream task actually benefits from overparameterization, in contrast to prior works on few-shot learning. For (2), we develop a theory explaining how feature covariance can implicitly reduce the sample complexity well below the degrees of freedom and lead to small estimation error. We then integrate these findings to obtain an overall performance guarantee for our meta-learning algorithm. Numerical experiments on real and synthetic data verify our insights on overparameterized meta-learning.
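
As a rough, self-contained illustration of the pipeline the abstract describes, the sketch below simulates linear-regression tasks whose parameters share a low-effective-rank covariance, estimates that covariance from earlier tasks, and then uses it as a task-aware regularizer (a covariance-weighted minimum-norm interpolator) on a new overparameterized task. The problem sizes, the Gaussian task model, and the crude outer-product covariance estimator are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Problem setup (illustrative sizes, not the paper's experiments) ---
d, n, T = 100, 30, 500        # feature dim > samples per task: overparameterized
# Ground-truth task covariance with a few dominant directions (low effective rank).
eigs = np.concatenate([np.ones(5), 0.01 * np.ones(d - 5)])
Q, _ = np.linalg.qr(rng.normal(size=(d, d)))
Sigma_sqrt = Q @ np.diag(np.sqrt(eigs)) @ Q.T
noise = 0.1

def sample_task():
    beta = Sigma_sqrt @ rng.normal(size=d)      # task parameter ~ N(0, Sigma)
    X = rng.normal(size=(n, d))
    y = X @ beta + noise * rng.normal(size=n)
    return X, y, beta

# --- Meta-step: estimate the shared task covariance from earlier tasks ---
# Hypothetical moment estimator: average outer products of each task's
# minimum-l2-norm interpolator. Crude, but it recovers the dominant
# directions; the paper's estimator may differ.
Sigma_hat = np.zeros((d, d))
for _ in range(T):
    X, y, _ = sample_task()
    b = X.T @ np.linalg.solve(X @ X.T, y)       # min-norm interpolator
    Sigma_hat += np.outer(b, b) / T
Sigma_hat += 1e-6 * np.eye(d)                   # keep it invertible

# --- Downstream task: task-aware regularization via weighted min-norm ---
# Solve  min_beta  beta^T Sigma_hat^{-1} beta  s.t.  X beta = y,
# whose closed form is  beta = Sigma_hat X^T (X Sigma_hat X^T)^{-1} y.
X, y, beta_true = sample_task()
beta_weighted = Sigma_hat @ X.T @ np.linalg.solve(X @ Sigma_hat @ X.T, y)
beta_plain = X.T @ np.linalg.solve(X @ X.T, y)  # unweighted baseline

for name, b in [("plain min-norm", beta_plain), ("task-aware", beta_weighted)]:
    print(f"{name:>14} estimation error: {np.linalg.norm(b - beta_true):.3f}")
```

Under these assumptions, the weighted interpolator concentrates the estimate in the few high-variance task directions, which is one way to read the abstract's claim that feature covariance can push the effective sample complexity well below the nominal degrees of freedom.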


Related research

02/22/2020 · The Sample Complexity of Meta Sparse Regression
This paper addresses the meta-learning problem in sparse linear regressi...

12/14/2020 · Variable-Shot Adaptation for Online Meta-Learning
Few-shot meta-learning methods consider the problem of learning new task...

02/25/2020 · A Sample Complexity Separation between Non-Convex and Convex Meta-Learning
One popular trend in meta-learning is to learn from many training tasks ...

02/09/2020 · Local Nonparametric Meta-Learning
A central goal of meta-learning is to find a learning rule that enables ...

08/08/2023 · Meta-Learning Operators to Optimality from Multi-Task Non-IID Data
A powerful concept behind much of the recent progress in machine learnin...

02/14/2021 · Sample Efficient Subspace-based Representations for Nonlinear Meta-Learning
Constructing good representations is critical for learning complex tasks...

06/10/2019 · Few-Shot Learning with Per-Sample Rich Supervision
Learning with few samples is a major challenge for parameter-rich models...
