
Meta-Learning with Neural Tangent Kernels

02/07/2021
by Yufan Zhou, et al.

Model-Agnostic Meta-Learning (MAML) has emerged as a standard framework for meta-learning, in which a meta-model is learned that can adapt quickly to new tasks. However, MAML is a double-loop optimization problem: every outer-loop training step must differentiate through the entire inner-loop optimization path, which can lead to both computational inefficiency and sub-optimal solutions. In this paper, we generalize MAML so that meta-learning can be defined in function spaces, and propose the first meta-learning paradigm in the Reproducing Kernel Hilbert Space (RKHS) induced by the meta-model's Neural Tangent Kernel (NTK). Within this paradigm, we introduce two meta-learning algorithms in the RKHS that no longer require the sub-optimal iterative inner-loop adaptation of the MAML framework. We achieve this by 1) replacing the adaptation with a fast-adaptive regularizer in the RKHS; and 2) solving the adaptation analytically based on NTK theory. Extensive experimental studies demonstrate the advantages of our paradigm in both efficiency and quality of solutions compared to related meta-learning algorithms. Our experiments further show that the proposed methods are more robust to adversarial attacks and out-of-distribution adaptation than popular baselines.
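To make the abstract's two ingredients concrete, here is a standard rendering (our notation, not necessarily the paper's exact formulation). MAML solves the bilevel problem

$$\min_{\theta} \sum_{i} \mathcal{L}^{\text{query}}_{\mathcal{T}_i}(\theta_i'), \qquad \theta_i' = \theta - \alpha \nabla_{\theta} \mathcal{L}^{\text{support}}_{\mathcal{T}_i}(\theta),$$

shown with a single inner gradient step; with more steps, $\theta_i'$ is the endpoint of an inner-loop optimization path, and the outer gradient must be back-propagated through that entire path. NTK theory instead suggests a closed-form adapted predictor, namely kernel ridge regression in the RKHS of the meta-model's NTK:

$$f_{\mathcal{T}_i}(x) = f_{\theta}(x) + k_{\theta}(x, X_i)\big(K_{\theta} + \lambda I\big)^{-1}\big(Y_i - f_{\theta}(X_i)\big),$$

where $(X_i, Y_i)$ is the support set of task $\mathcal{T}_i$, $k_{\theta}(x, x') = \langle \nabla_{\theta} f_{\theta}(x), \nabla_{\theta} f_{\theta}(x') \rangle$ is the NTK, and $K_{\theta} = k_{\theta}(X_i, X_i)$.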
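The closed form above is straightforward to implement. Below is a minimal, illustrative sketch in JAX of NTK-based adaptation on a regression task; it is our own reconstruction under the stated assumptions, not the authors' released code, and names such as model_fn, ntk_adapt_predict, and the ridge parameter lam are hypothetical.

```python
import jax
import jax.numpy as jnp

def model_fn(params, x):
    """Tiny MLP standing in for the meta-model f_theta; x has shape (n, d)."""
    h = jnp.tanh(x @ params["w1"] + params["b1"])
    return (h @ params["w2"] + params["b2"]).squeeze(-1)  # shape (n,)

def empirical_ntk(params, x1, x2):
    """k(x, x') = <df(x)/dtheta, df(x')/dtheta>, the empirical NTK."""
    j1 = jax.jacobian(model_fn)(params, x1)  # pytree, leaves (n1, *param_shape)
    j2 = jax.jacobian(model_fn)(params, x2)

    def flat(tree):
        leaves = jax.tree_util.tree_leaves(tree)
        return jnp.concatenate([l.reshape(l.shape[0], -1) for l in leaves], axis=1)

    return flat(j1) @ flat(j2).T  # Gram matrix of shape (n1, n2)

def ntk_adapt_predict(params, x_s, y_s, x_q, lam=1e-3):
    """Closed-form adaptation: kernel ridge regression in the NTK's RKHS."""
    k_ss = empirical_ntk(params, x_s, x_s)
    k_qs = empirical_ntk(params, x_q, x_s)
    residual = y_s - model_fn(params, x_s)          # fit the support residual
    alpha = jnp.linalg.solve(k_ss + lam * jnp.eye(k_ss.shape[0]), residual)
    return model_fn(params, x_q) + k_qs @ alpha     # adapted query predictions

# Hypothetical usage on a 1-d regression task with a 5-shot support set.
key = jax.random.PRNGKey(0)
k1, k2 = jax.random.split(key)
params = {"w1": jax.random.normal(k1, (1, 32)) * 0.5, "b1": jnp.zeros(32),
          "w2": jax.random.normal(k2, (32, 1)) * 0.5, "b2": jnp.zeros(1)}
x_s = jnp.linspace(-2.0, 2.0, 5).reshape(-1, 1)
y_s = jnp.sin(x_s).squeeze(-1)
x_q = jnp.linspace(-2.0, 2.0, 50).reshape(-1, 1)
preds = ntk_adapt_predict(params, x_s, y_s, x_q)    # shape (50,)
```

The single linear solve replaces MAML's iterative inner loop, so no optimization path needs to be differentiated through; the ridge term $\lambda I$ plays roughly the stabilizing role that early stopping plays in gradient-based adaptation.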

Related Research

08/24/2021 · Adaptation-Agnostic Meta-Training
Many meta-learning algorithms can be formulated into an interleaved proc...

12/28/2022 · Wormhole MAML: Meta-Learning in Glued Parameter Space
In this paper, we introduce a novel variation of model-agnostic meta-lea...

04/19/2022 · Metappearance: Meta-Learning for Visual Appearance Reproduction
There currently are two main approaches to reproducing visual appearance...

06/13/2022 · Faster Optimization-Based Meta-Learning Adaptation Phase
Neural networks require a large amount of annotated data to learn. Meta-...

02/01/2021 · Meta-learning with negative learning rates
Deep learning models require a large amount of data to perform well. Whe...

06/23/2020 · On the Global Optimality of Model-Agnostic Meta-Learning
Model-agnostic meta-learning (MAML) formulates meta-learning as a bileve...

09/08/2021 · Do What Nature Did To Us: Evolving Plastic Recurrent Neural Networks For Task Generalization
While artificial neural networks (ANNs) have been widely adopted in mach...