HyperMAML: Few-Shot Adaptation of Deep Models with Hypernetworks

05/31/2022
by   M. Przewięźlikowski, et al.

The aim of Few-Shot learning methods is to train models which can easily adapt to previously unseen tasks, based on small amounts of data. One of the most popular and elegant Few-Shot learning approaches is Model-Agnostic Meta-Learning (MAML). The main idea behind this method is to learn the general weights of the meta-model, which are further adapted to specific problems in a small number of gradient steps. However, the method's main limitation lies in the fact that the update procedure is realized by gradient-based optimization. In consequence, MAML cannot always sufficiently modify the weights in one or even a few gradient iterations. On the other hand, using many gradient steps results in a complex and time-consuming optimization procedure which is hard to train in practice and may lead to overfitting. In this paper, we propose HyperMAML, a novel generalization of MAML in which the update procedure itself is trained as part of the model. Namely, in HyperMAML, instead of updating the weights with gradient descent, we use a trainable hypernetwork for this purpose. Consequently, in this framework, the model can generate significant updates whose range is not limited to a fixed number of gradient steps. Experiments show that HyperMAML consistently outperforms MAML and performs comparably to other state-of-the-art techniques on a number of standard Few-Shot learning benchmarks.
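To make the core idea concrete, below is a minimal sketch of a hypernetwork-based update in PyTorch. It is not the authors' implementation: the module names (UpdateHypernetwork, hyper_maml_adapt), the aggregation over the support set, and the choice to update only the classifier head are illustrative assumptions. The point it demonstrates is that the task-specific update is produced in a single forward pass of a trained hypernetwork rather than by an inner loop of gradient steps, as in MAML.

```python
# Illustrative sketch of a hypernetwork-generated weight update (not the paper's code).
# Assumptions: the encoder maps inputs to emb_dim features; only the linear
# classifier head (clf_weight, clf_bias) is adapted per task.
import torch
import torch.nn as nn
import torch.nn.functional as F


class UpdateHypernetwork(nn.Module):
    """Maps support-set embeddings and labels to an update of the classifier
    weights, replacing MAML's gradient-based inner loop."""

    def __init__(self, emb_dim: int, n_way: int, hidden: int = 256):
        super().__init__()
        in_dim = emb_dim + n_way            # embedding concatenated with one-hot label
        out_dim = n_way * emb_dim + n_way   # flattened update for weight matrix and bias
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )
        self.n_way, self.emb_dim = n_way, emb_dim

    def forward(self, support_emb, support_labels):
        one_hot = F.one_hot(support_labels, self.n_way).float()
        per_example = self.net(torch.cat([support_emb, one_hot], dim=-1))
        update = per_example.mean(dim=0)    # aggregate over the support set
        dW = update[: self.n_way * self.emb_dim].view(self.n_way, self.emb_dim)
        db = update[self.n_way * self.emb_dim:]
        return dW, db


def hyper_maml_adapt(encoder, clf_weight, clf_bias, hypernet,
                     support_x, support_y, query_x):
    """One-shot adaptation: the hypernetwork generates the update directly,
    instead of running gradient descent on the support loss."""
    support_emb = encoder(support_x)
    dW, db = hypernet(support_emb, support_y)
    adapted_w, adapted_b = clf_weight + dW, clf_bias + db
    query_emb = encoder(query_x)
    return query_emb @ adapted_w.t() + adapted_b   # query logits
```

In this sketch the encoder, the initial classifier parameters, and the hypernetwork would all be meta-trained end-to-end on query-set losses, so the generated update is not constrained to lie within the reach of a few gradient steps.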

