Exploring the Similarity of Representations in Model-Agnostic Meta-Learning

05/12/2021
by   Thomas Goerttler, et al.

In recent years, model-agnostic meta-learning (MAML) has been one of the most promising approaches in meta-learning. It can be applied to different kinds of problems, e.g., reinforcement learning, and also shows good results on few-shot learning tasks. Despite its tremendous success on these tasks, it has not yet been fully revealed why it works so well. Recent work proposes that MAML reuses features rather than learning rapidly. In this paper, we want to inspire a deeper understanding of this question by analyzing MAML's representations. We apply representation similarity analysis (RSA), a well-established method in neuroscience, to the few-shot learning instantiation of MAML. Although part of our analysis supports the general finding that feature reuse is predominant, we also reveal arguments against that conclusion. The increase in similarity of the layers closer to the input arises from the learning task itself and not from the model. In addition, the inner gradient steps change the representations more broadly than the updates during meta-training do.
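To make the analysis concrete, the sketch below shows one common way to compute RSA for a layer's activations recorded before and after adaptation (e.g., before and after MAML's inner gradient steps). It is a minimal illustration in plain NumPy, not the paper's exact pipeline: the array shapes, the Pearson-based dissimilarity matrices, and the random stand-in activations are all illustrative assumptions.

```python
# Minimal RSA sketch (illustrative assumptions, not the paper's exact protocol):
# build a representational dissimilarity matrix (RDM) per representation,
# then correlate the RDMs' upper triangles (second-order similarity).
import numpy as np


def rdm(activations: np.ndarray) -> np.ndarray:
    """Representational dissimilarity matrix: 1 - Pearson correlation
    between the activation vectors of every pair of inputs.
    activations: array of shape (n_inputs, n_features)."""
    return 1.0 - np.corrcoef(activations)


def rsa_similarity(act_a: np.ndarray, act_b: np.ndarray) -> float:
    """Compare two representations of the same inputs by correlating
    the upper triangles of their RDMs."""
    iu = np.triu_indices(act_a.shape[0], k=1)
    return float(np.corrcoef(rdm(act_a)[iu], rdm(act_b)[iu])[0, 1])


# Toy usage: random stand-ins for one layer's activations; in the actual
# analysis these would be recorded from the MAML network on a fixed batch.
rng = np.random.default_rng(0)
before = rng.normal(size=(32, 64))                 # 32 inputs, 64 hidden units
after = before + 0.1 * rng.normal(size=(32, 64))   # slightly perturbed copy
print(f"RSA similarity: {rsa_similarity(before, after):.3f}")
```

A high RSA similarity between the pre- and post-adaptation activations of a layer would indicate that the layer's representation is largely reused, while a low value would indicate that adaptation changed it substantially.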


Related research

05/17/2019 · Alpha MAML: Adaptive Model-Agnostic Meta-Learning
Model-agnostic meta-learning (MAML) is a meta-learning technique to trai...

12/11/2018 · Rethink and Redesign Meta learning
Recently, Meta-learning has been shown as a promising way to improve the...

06/07/2018 · Probabilistic Model-Agnostic Meta-Learning
Meta-learning for few-shot learning entails acquiring a prior over previ...

08/20/2020 · Does MAML really want feature reuse only?
Meta-learning, the effort to solve new tasks with only a few samples, ha...

12/24/2021 · Does MAML Only Work via Feature Re-use? A Data Centric Perspective
Recent work has suggested that a good embedding is all we need to solve ...

12/05/2019 · MetaFun: Meta-Learning with Iterative Functional Updates
Few-shot supervised learning leverages experience from previous learning...

05/18/2022 · Meta-Learning Sparse Compression Networks
Recent work in Deep Learning has re-imagined the representation of data ...
