Wormhole MAML: Meta-Learning in Glued Parameter Space

12/28/2022
by   Chih-Jung Tracy Chang, et al.
Stanford University

In this paper, we introduce a novel variation of model-agnostic meta-learning (MAML) in which an extra multiplicative parameter is introduced into the inner-loop adaptation. This parameter creates a shortcut in the parameter space for the inner-loop adaptation and increases model expressivity in a highly controllable manner. We show both theoretically and numerically that our variation alleviates the problem of conflicting gradients and improves training dynamics. We conduct experiments on three distinct problems: a toy classification problem on threshold comparison, a regression problem on wavelet transforms, and a classification problem on MNIST. We also discuss ways to generalize our method to a broader class of problems.
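The abstract describes the method only at a high level, so the following is a minimal sketch, not the paper's actual formulation: it assumes the extra multiplicative parameter `s` rescales the meta-initialization before the standard MAML gradient steps, which is one plausible reading of a "shortcut in the parameter space". The task, learning rate, and function names here are illustrative.

```python
import numpy as np

def inner_loop_adapt(theta, grad_fn, lr=0.01, s=1.0, steps=1):
    """MAML-style inner-loop adaptation with a hypothetical
    multiplicative parameter `s`.

    With s == 1 this reduces to the plain MAML inner loop; letting s
    vary adds one controllable degree of freedom (the 'wormhole'
    direction, under this sketch's assumptions) to the adaptation.
    """
    phi = s * theta                      # multiplicative rescaling (assumed form)
    for _ in range(steps):
        phi = phi - lr * grad_fn(phi)    # ordinary gradient descent step
    return phi

# Toy task: loss(phi) = 0.5 * ||phi - target||^2, so grad = phi - target.
target = np.array([1.0, -2.0])
grad = lambda phi: phi - target

theta0 = np.array([0.5, 0.5])            # shared meta-initialization
adapted = inner_loop_adapt(theta0, grad, lr=0.5, s=1.0, steps=2)
```

In a full meta-learning setup the outer loop would backpropagate the post-adaptation loss through these inner steps to update both `theta` and `s`; the sketch shows only the inner loop.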
