MAC: A Meta-Learning Approach for Feature Learning and Recombination

09/20/2022
by   S. Tiwari, et al.

Optimization-based meta-learning aims to learn an initialization from which a new, unseen task can be learned within a few gradient updates. Model-Agnostic Meta-Learning (MAML) is the benchmark algorithm in this family and comprises two optimization loops: the inner loop learns a new task, while the outer loop produces the meta-initialization. However, the ANIL (Almost No Inner Loop) algorithm shows that feature reuse is an alternative to rapid learning in MAML: the meta-initialization phase primes MAML for feature reuse and largely obviates the need for rapid learning. Contrary to ANIL, we hypothesize that new features may need to be learned during meta-testing. A new, unseen task drawn from a dissimilar distribution would necessitate rapid learning in addition to the reuse and recombination of existing features. In this paper, we invoke the width-depth duality of neural networks and increase the width of the network by adding extra computational units (ACUs). The ACUs enable the learning of new atomic features on the meta-testing task, and the associated increase in width facilitates information propagation in the forward pass. The newly learned features are combined with the existing features in the last layer for meta-learning. Experimental results show that our proposed MAC method outperforms the existing ANIL algorithm on non-similar task distributions by approximately 13%.
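The core idea of widening a network with extra computational units can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the paper's implementation): the meta-learned hidden units are kept frozen at meta-test time, a handful of fresh ACU weights are appended to widen the hidden layer, and the last layer recombines old and new features. All names, sizes, and initializations here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

class WidenedNet:
    """Toy two-layer net sketching width-based feature addition.

    Hypothetical illustration of the MAC idea: W1/b1 are the
    meta-learned (reused) hidden units, W_acu/b_acu are the extra
    computational units added at meta-test time, and W2 combines
    both feature groups in the last layer.
    """

    def __init__(self, in_dim, hidden, out_dim, n_acu):
        self.W1 = rng.normal(0.0, 0.1, (in_dim, hidden))    # meta-learned, frozen
        self.b1 = np.zeros(hidden)
        self.W_acu = rng.normal(0.0, 0.1, (in_dim, n_acu))  # new atomic features
        self.b_acu = np.zeros(n_acu)
        # Recombination layer sees both old and new features.
        self.W2 = rng.normal(0.0, 0.1, (hidden + n_acu, out_dim))
        self.b2 = np.zeros(out_dim)

    def features(self, x):
        old = relu(x @ self.W1 + self.b1)        # reused features
        new = relu(x @ self.W_acu + self.b_acu)  # newly learned features
        return np.concatenate([old, new], axis=1)

    def forward(self, x):
        return self.features(x) @ self.W2 + self.b2

net = WidenedNet(in_dim=4, hidden=8, out_dim=2, n_acu=3)
x = rng.normal(size=(5, 4))
out = net.forward(x)
print(out.shape)  # (5, 2): widening leaves the output shape unchanged
```

During meta-test adaptation, only `W_acu`, `b_acu`, and the last layer `W2` would receive gradient updates, so rapid learning of new features coexists with reuse of the frozen meta-learned ones.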


Related research

08/20/2020 · Does MAML really want feature reuse only?
Meta-learning, the effort to solve new tasks with only a few samples, ha...

09/19/2019 · Rapid Learning or Feature Reuse? Towards Understanding the Effectiveness of MAML
An important research direction in machine learning has centered around ...

04/06/2023 · Learning to Learn with Indispensable Connections
Meta-learning aims to solve unseen tasks with few labelled instances. Ne...

03/18/2022 · Negative Inner-Loop Learning Rates Learn Universal Features
Model Agnostic Meta-Learning (MAML) consists of two optimization loops: ...

06/30/2021 · How to Train Your MAML to Excel in Few-Shot Classification
Model-agnostic meta-learning (MAML) is arguably the most popular meta-le...

06/29/2021 · MAML is a Noisy Contrastive Learner
Model-agnostic meta-learning (MAML) is one of the most popular and widel...

06/12/2020 · Attentive Feature Reuse for Multi Task Meta learning
We develop new algorithms for simultaneous learning of multiple tasks (e...
