Few-shot Continual Learning: a Brain-inspired Approach

04/19/2021
by Liyuan Wang, et al.

Continually learning new tasks from only a few examples is an important yet challenging setting. Although numerous efforts have been devoted to either continual learning or few-shot learning, little work has considered the combined setting of few-shot continual learning (FSCL), which must minimize catastrophic forgetting of old tasks while gradually improving the ability of few-shot generalization. In this paper, we provide a first systematic study of FSCL and present an effective solution with deep neural networks. Our solution is based on the observation that continual learning of a task sequence inevitably interferes with few-shot generalization, which makes it highly nontrivial to extend few-shot learning strategies to continual learning scenarios. We draw inspiration from the robust brain system and develop a method that (1) interdependently updates a pair of fast and slow weights for continual learning and few-shot learning to disentangle their divergent objectives, inspired by the biological model of meta-plasticity and fast and slow synapses; and (2) applies a brain-inspired two-step consolidation strategy to learn a task sequence without forgetting in the fast weights while improving generalization without overfitting in the slow weights. Extensive results on various benchmarks show that our method achieves better performance than joint training on all the tasks ever seen. The ability of few-shot generalization also improves substantially with incoming tasks and examples.
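To make the fast/slow-weight idea concrete, below is a minimal PyTorch sketch of one possible two-step scheme: the fast weights fit the current task under a penalty anchoring them to the slow weights, and the slow weights are then gradually consolidated toward the fast weights. Everything here, including the function continual_few_shot_step and the hyperparameters lam and alpha, is an illustrative assumption rather than the paper's actual method.

import copy

import torch
import torch.nn as nn

def continual_few_shot_step(model, slow_model, loader, lam=1.0, alpha=0.01, lr=1e-3):
    """One hypothetical fast/slow consolidation step (not the paper's algorithm).

    model      -- holds the fast weights, trained directly on the current task
    slow_model -- holds the slow weights, updated only by gradual consolidation
    lam        -- strength of the penalty anchoring fast weights to slow weights
                  (a stand-in for the anti-forgetting mechanism)
    alpha      -- slow-weight consolidation rate (a stand-in for the
                  anti-overfitting mechanism)
    """
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    criterion = nn.CrossEntropyLoss()

    # Step 1: the fast weights learn the new task while being pulled toward
    # the slow weights, which limits interference with previously learned tasks.
    for x, y in loader:
        loss = criterion(model(x), y)
        for p_fast, p_slow in zip(model.parameters(), slow_model.parameters()):
            loss = loss + lam * (p_fast - p_slow.detach()).pow(2).sum()
        opt.zero_grad()
        loss.backward()
        opt.step()

    # Step 2: the slow weights absorb the new knowledge gradually via an
    # exponential-moving-average update, improving generalization without
    # overfitting to the few examples of the current task.
    with torch.no_grad():
        for p_fast, p_slow in zip(model.parameters(), slow_model.parameters()):
            p_slow.mul_(1.0 - alpha).add_(alpha * p_fast)

# Usage sketch: the slow weights start as a copy of the fast weights.
net = nn.Linear(16, 4)                                   # stand-in backbone
slow_net = copy.deepcopy(net)
toy_loader = [(torch.randn(8, 16), torch.randint(0, 4, (8,)))]
continual_few_shot_step(net, slow_net, toy_loader)

The separation of the two update rules is the point of the sketch: the penalty in step 1 and the slow consolidation rate in step 2 can be tuned independently, mirroring the paper's goal of disentangling the divergent objectives of continual and few-shot learning.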
