Meta-Learning with Neural Tangent Kernels

02/07/2021
by Yufan Zhou, et al.

Model-Agnostic Meta-Learning (MAML) has emerged as a standard framework for meta-learning, in which a meta-model is learned with the ability to adapt quickly to new tasks. However, as a bi-level (double-loop) optimization problem, MAML must differentiate through the entire inner-loop optimization path at every outer-loop training step, which can lead to both computational inefficiency and sub-optimal solutions. In this paper, we generalize MAML by defining meta-learning in function spaces, and propose the first meta-learning paradigm in the Reproducing Kernel Hilbert Space (RKHS) induced by the meta-model's Neural Tangent Kernel (NTK). Within this paradigm, we introduce two meta-learning algorithms in the RKHS, which no longer require the sub-optimal iterative inner-loop adaptation of the MAML framework. We achieve this by 1) replacing the adaptation with a fast-adaptive regularizer in the RKHS; and 2) solving the adaptation analytically based on NTK theory. Extensive experimental studies demonstrate the advantages of our paradigm over related meta-learning algorithms in both efficiency and quality of solutions. Our experiments further show that the proposed methods are more robust to adversarial attacks and out-of-distribution adaptation than popular baselines.
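The analytic adaptation described in point 2) can be illustrated in miniature. Under the NTK/linearized-network view, adapting to a task's support set reduces to kernel ridge regression with the empirical NTK, K[i, j] = ⟨∇f(x_i), ∇f(x_j)⟩, so the adapted prediction has a closed form instead of requiring inner-loop gradient steps. The sketch below is a hypothetical, minimal illustration of that idea (not the authors' implementation): all names, the tiny ReLU network, and the regularization constant are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_params(d_in, d_h):
    # small one-hidden-layer ReLU network with scalar output
    return {"W1": rng.normal(size=(d_h, d_in)) / np.sqrt(d_in),
            "W2": rng.normal(size=(1, d_h)) / np.sqrt(d_h)}

def forward(params, x):
    h = np.maximum(params["W1"] @ x, 0.0)
    return float(params["W2"] @ h)

def grad(params, x):
    # flattened gradient of the scalar output w.r.t. all parameters
    h_pre = params["W1"] @ x
    h = np.maximum(h_pre, 0.0)
    mask = (h_pre > 0).astype(float)
    dW1 = np.outer(params["W2"].ravel() * mask, x)   # (d_h, d_in)
    dW2 = h                                          # (d_h,)
    return np.concatenate([dW1.ravel(), dW2.ravel()])

def ntk(params, X1, X2):
    # empirical NTK: K[i, j] = <grad f(x_i), grad f(x_j)>
    G1 = np.stack([grad(params, x) for x in X1])
    G2 = np.stack([grad(params, x) for x in X2])
    return G1 @ G2.T

# a task's support (adaptation) set and one query point -- synthetic data
X_s = rng.normal(size=(8, 3))
y_s = rng.normal(size=8)
x_q = rng.normal(size=3)

params = init_params(3, 16)
lam = 1e-3  # illustrative ridge regularizer

K_ss = ntk(params, X_s, X_s)            # support-support NTK Gram matrix
K_qs = ntk(params, x_q[None, :], X_s)   # query-support NTK row

# closed-form adapted prediction under the linearized-network view:
# f_adapted(x_q) = f(x_q) + K(x_q, X_s) (K(X_s, X_s) + lam I)^{-1} (y_s - f(X_s))
f0_s = np.array([forward(params, x) for x in X_s])
f_q = forward(params, x_q) + float(
    K_qs @ np.linalg.solve(K_ss + lam * np.eye(len(X_s)), y_s - f0_s))
print(f_q)
```

Because the adaptation is a single linear solve against the NTK Gram matrix, no inner-loop optimization path needs to be differentiated through during meta-training, which is the source of the efficiency gain the abstract describes.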

