BiRT: Bio-inspired Replay in Vision Transformers for Continual Learning

05/08/2023
by Kishaan Jeeveswaran, et al.

The ability of deep neural networks to continually learn and adapt to a sequence of tasks has remained challenging due to catastrophic forgetting of previously learned tasks. Humans, on the other hand, have a remarkable ability to acquire, assimilate, and transfer knowledge across tasks throughout their lifetime without catastrophic forgetting. The versatility of the brain can be attributed to the rehearsal of abstract experiences through a complementary learning system. However, representation rehearsal in vision transformers lacks diversity, which leads to overfitting; as a consequence, performance drops significantly compared to raw image rehearsal. Therefore, we propose BiRT, a novel representation rehearsal-based continual learning approach using vision transformers. Specifically, we introduce constructive noise at various stages of the vision transformer and enforce consistency in predictions with respect to an exponential moving average of the working model. Our method provides consistent performance gains over raw image and vanilla representation rehearsal on several challenging CL benchmarks, while being memory efficient and robust to natural and adversarial corruptions.
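
The method sentence above couples three ingredients: rehearsal of stored intermediate representations instead of raw images, constructive noise injected into the replayed representations to restore diversity, and a consistency loss toward an exponential moving average (EMA) of the working model. The PyTorch sketch below illustrates this general pattern. It is a minimal, assumption-laden illustration, not the authors' implementation: the toy `WorkingModel`, the Gaussian noise scale, the MSE consistency term, and the loss weights are all placeholders chosen for demonstration; BiRT's actual architecture, noise types, and objectives are specified in the paper.

```python
# Hypothetical sketch of representation rehearsal with noise and EMA
# consistency (names and hyperparameters are illustrative assumptions).
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class WorkingModel(nn.Module):
    """Toy stand-in for a ViT split into an encoder and a classification head."""
    def __init__(self, in_dim=128, repr_dim=64, num_classes=10):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, repr_dim), nn.GELU())
        self.head = nn.Linear(repr_dim, num_classes)

    def forward(self, x):
        return self.head(self.encoder(x))

@torch.no_grad()
def ema_update(ema_model, model, decay=0.999):
    # Exponential moving average of the working model's weights.
    for p_ema, p in zip(ema_model.parameters(), model.parameters()):
        p_ema.mul_(decay).add_(p, alpha=1.0 - decay)

model = WorkingModel()
ema_model = copy.deepcopy(model).eval()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Episodic memory stores intermediate *representations*, not raw images
# (here filled with random stand-in data for a runnable example).
buffer_reprs = torch.randn(32, 64)
buffer_labels = torch.randint(0, 10, (32,))

for step in range(100):
    # Constructive noise on rehearsed representations to restore diversity.
    noisy_reprs = buffer_reprs + 0.1 * torch.randn_like(buffer_reprs)
    logits = model.head(noisy_reprs)
    ce = F.cross_entropy(logits, buffer_labels)

    # Consistency of predictions with the slowly updated EMA model.
    with torch.no_grad():
        ema_logits = ema_model.head(noisy_reprs)
    consistency = F.mse_loss(logits, ema_logits)

    loss = ce + 0.1 * consistency
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    ema_update(ema_model, model)
```

Note that in this sketch only the head receives gradients from rehearsal, since replay happens at the representation level; in the full method, new-task images would still flow through the entire transformer alongside the replayed representations.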


