Meta-attention for ViT-backed Continual Learning

03/22/2022
by Mengqi Xue, et al.

Continual learning is a longstanding research topic due to its crucial role in tackling continually arriving tasks. To date, the study of continual learning in computer vision has been largely restricted to convolutional neural networks (CNNs). Recently, however, the newly emerging vision transformers (ViTs) have begun to dominate the field, leaving CNN-based continual learning methods lagging behind, as they can suffer severe performance degradation when applied directly to ViTs. In this paper, we study ViT-backed continual learning, aiming for higher performance by building on recent advances in ViTs. Inspired by mask-based continual learning methods for CNNs, where a per-task mask is learned to adapt a pre-trained model to each new task, we propose MEta-ATtention (MEAT), i.e., attention to self-attention, which adapts a pre-trained ViT to new tasks without sacrificing performance on already learned tasks. Unlike prior mask-based methods such as Piggyback, where every parameter is associated with a corresponding mask, MEAT leverages the characteristics of ViTs and masks only a portion of their parameters. This makes MEAT more efficient and effective, with less overhead and higher accuracy. Extensive experiments demonstrate that MEAT is significantly superior to its state-of-the-art CNN counterparts, with absolute accuracy gains of 4.0 to 6.0 points. Our code has been released at https://github.com/zju-vipa/MEAT-TIL.
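As a rough illustration of the core idea, masking only a portion of a ViT's parameters per task, the sketch below shows a self-attention block whose frozen pre-trained QKV projection is gated by a per-task learnable binary mask trained with a straight-through estimator. This is a generic, hedged sketch of mask-based adaptation under stated assumptions, not the authors' exact MEAT formulation: the class name, the choice to mask only the QKV weight, the mask initialization, and the threshold are all illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MaskedSelfAttention(nn.Module):
    """Self-attention whose frozen pre-trained QKV projection is gated by a
    per-task learnable binary mask (illustrative sketch, not the exact MEAT design)."""

    def __init__(self, dim: int, num_heads: int = 4, num_tasks: int = 2,
                 threshold: float = 0.0):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.scale = self.head_dim ** -0.5
        self.threshold = threshold

        # Frozen pre-trained projections, shared across all tasks.
        self.qkv = nn.Linear(dim, dim * 3, bias=False)
        self.proj = nn.Linear(dim, dim, bias=False)
        for p in list(self.qkv.parameters()) + list(self.proj.parameters()):
            p.requires_grad_(False)

        # One real-valued mask score tensor per task, defined only over the
        # QKV weight -- i.e., only a portion of the parameters is masked.
        self.task_masks = nn.ParameterList([
            nn.Parameter(torch.full_like(self.qkv.weight, 0.01))
            for _ in range(num_tasks)
        ])

    def _binarize(self, scores: torch.Tensor) -> torch.Tensor:
        # Hard 0/1 threshold in the forward pass; straight-through
        # gradients flow to the real-valued scores in the backward pass.
        hard = (scores > self.threshold).float()
        return hard + scores - scores.detach()

    def forward(self, x: torch.Tensor, task_id: int) -> torch.Tensor:
        B, N, C = x.shape
        mask = self._binarize(self.task_masks[task_id])
        qkv = F.linear(x, self.qkv.weight * mask)   # task-specific gating
        q, k, v = (qkv.reshape(B, N, 3, self.num_heads, self.head_dim)
                      .permute(2, 0, 3, 1, 4))
        attn = (q @ k.transpose(-2, -1)) * self.scale
        attn = attn.softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(B, N, C)
        return self.proj(out)
```

In this kind of scheme only the current task's mask scores receive gradients while the shared backbone weights stay frozen, so adding a new task cannot overwrite what earlier tasks rely on; at inference, the mask for the queried task id is applied, e.g. `MaskedSelfAttention(64)(torch.randn(2, 16, 64), task_id=0)`.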
