
Wormhole MAML: Meta-Learning in Glued Parameter Space

by   Chih-Jung Tracy Chang, et al.
Stanford University

In this paper, we introduce a novel variation of model-agnostic meta-learning in which an extra multiplicative parameter is added to the inner-loop adaptation. This variation creates a shortcut in the parameter space for the inner-loop adaptation and increases model expressivity in a highly controllable manner. We show both theoretically and numerically that it alleviates the problem of conflicting gradients and improves training dynamics. We conduct experiments on three distinct problems: a toy classification problem for threshold comparison, a regression problem for wavelet transform, and a classification problem on MNIST. We also discuss ways to generalize our method to a broader class of problems.
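To make the idea concrete, the following is a minimal sketch of how a multiplicative parameter might enter the inner-loop update, contrasted with the standard MAML step. The placement of the multiplicative factor `gamma` here is an assumption for illustration; the paper's exact formulation may differ.

```python
import numpy as np

def inner_step_maml(theta, grad, alpha=0.5):
    """Standard MAML inner-loop adaptation: one gradient step on the task loss."""
    return theta - alpha * grad

def inner_step_multiplicative(theta, grad, gamma, alpha=0.5):
    """Hypothetical variant: a meta-learned multiplicative parameter gamma
    rescales the parameters during adaptation, acting as a shortcut in
    parameter space (placement of gamma is an assumption)."""
    return gamma * theta - alpha * grad

# Toy task: quadratic loss L(theta) = 0.5 * ||theta - t||^2
t = np.array([1.0, -2.0])            # task optimum
theta = np.array([1.0, 1.0])         # meta-initialization
grad = theta - t                     # gradient of the quadratic loss at theta

adapted = inner_step_maml(theta, grad)
adapted_mult = inner_step_multiplicative(theta, grad, gamma=np.array([2.0, 2.0]))
```

Because `gamma` multiplies the parameters rather than the gradient, the variant can move the adapted weights to regions a single additive gradient step could not reach from the same initialization.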
