Generalising via Meta-Examples for Continual Learning in the Wild

01/28/2021
by Alessia Bertugli, et al.

Learning quickly and continually is still an ambitious task for neural networks. Indeed, many real-world applications do not reflect the setting in which neural networks shine, as data are usually scarce, mostly unlabelled, and arrive as a stream. To narrow this gap, we introduce FUSION - Few-shot UnSupervIsed cONtinual learning - a novel strategy designed for neural networks that "learn in the wild", simulating a realistic distribution and flow of unbalanced tasks. We equip FUSION with MEML - Meta-Example Meta-Learning - a new module that simultaneously alleviates catastrophic forgetting and favours generalisation and the future learning of new tasks. To encourage feature reuse during meta-optimisation, our model performs a single inner loop per task, taking advantage of an aggregated representation obtained through a self-attention mechanism. To further enhance the generalisation capability of MEML, we extend it with a technique that creates various augmented tasks and optimises over the hardest. Experimental results on few-shot learning benchmarks show that our model exceeds the other baselines in both FUSION and the fully supervised case. We also explore how it behaves in standard continual learning, consistently outperforming state-of-the-art approaches.
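The core mechanism described above - aggregating a task's per-sample embeddings into a single "meta-example" via self-attention, so that one inner-loop update suffices - can be illustrated with a minimal NumPy sketch. The function and attention parameter names (`meta_example`, `w_att`) are hypothetical stand-ins, not the paper's actual implementation:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def meta_example(embeddings, w_att):
    """Collapse a task's per-sample embeddings (n, d) into a single
    meta-example (d,) using attention weights from a learned vector
    w_att (d,). Hypothetical sketch of the aggregation step."""
    scores = embeddings @ w_att      # (n,) attention logits, one per sample
    alphas = softmax(scores)         # (n,) attention weights, sum to 1
    return alphas @ embeddings       # (d,) attention-weighted aggregation

rng = np.random.default_rng(0)
emb = rng.normal(size=(5, 8))        # 5 samples in a task, 8-dim embeddings
w_att = rng.normal(size=8)           # hypothetical learned attention vector
me = meta_example(emb, w_att)        # single meta-example for the task
```

In the full method, one inner-loop gradient step would then be taken on this single aggregated representation instead of on every sample, which is what encourages feature reuse across the task.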


Related research:
- Few-Shot Unsupervised Continual Learning through Meta-Examples (09/17/2020)
- Few-shot Continual Learning: a Brain-inspired Approach (04/19/2021)
- Self-Attention Meta-Learner for Continual Learning (01/28/2021)
- Meta Continual Learning via Dynamic Programming (08/05/2020)
- Learning from the Past: Continual Meta-Learning via Bayesian Graph Modeling (11/12/2019)
- SPeCiaL: Self-Supervised Pretraining for Continual Learning (06/16/2021)
- Meta Networks (03/02/2017)
