Conditional Channel Gated Networks for Task-Aware Continual Learning

03/31/2020
by Davide Abati et al.

Convolutional Neural Networks experience catastrophic forgetting when optimized on a sequence of learning problems: as they meet the objective of the current training examples, their performance on previous tasks drops drastically. In this work, we introduce a novel framework to tackle this problem with conditional computation. We equip each convolutional layer with task-specific gating modules that select which filters to apply to the given input. This way, we achieve two appealing properties. Firstly, the execution patterns of the gates allow us to identify and protect important filters, ensuring no loss in the performance of the model for previously learned tasks. Secondly, by using a sparsity objective, we can promote the selection of a limited set of kernels, retaining sufficient model capacity to digest new tasks.

Existing solutions require, at test time, awareness of the task to which each example belongs. This knowledge, however, may not be available in many practical scenarios. Therefore, we additionally introduce a task classifier that predicts the task label of each example, to deal with settings in which a task oracle is not available. We validate our proposal on four continual learning datasets. Results show that our model consistently outperforms existing methods both in the presence and in the absence of a task oracle. Notably, on the Split SVHN and Imagenet-50 datasets, our model yields up to 23.98% and 17.42% improvement in accuracy over competing methods.
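To make the gating idea concrete, here is a minimal, hypothetical PyTorch rendition of a channel-gated convolution: each task owns a small gating head that selects which filters to apply, and a sparsity penalty keeps that selection limited. The module names, the Gumbel-Sigmoid relaxation, and all hyperparameters are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class TaskGatedConv(nn.Module):
    """Convolution whose output channels are gated per task (sketch).

    Each task owns a small gating head that maps globally pooled input
    features to one logit per filter. During training the gates are
    relaxed with Gumbel-Sigmoid noise so they stay differentiable; at
    test time they harden to binary on/off decisions.
    """

    def __init__(self, in_ch, out_ch, num_tasks, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size,
                              padding=kernel_size // 2)
        # One lightweight gating head per task: pooled input -> filter logits.
        self.gates = nn.ModuleList(
            nn.Sequential(nn.Linear(in_ch, 16), nn.ReLU(),
                          nn.Linear(16, out_ch))
            for _ in range(num_tasks)
        )

    def forward(self, x, task_id, temperature=2.0 / 3.0):
        logits = self.gates[task_id](x.mean(dim=(2, 3)))   # (B, out_ch)
        if self.training:
            # Gumbel-Sigmoid (binary concrete) relaxation of binary gates.
            u = torch.rand_like(logits).clamp(1e-6, 1 - 1e-6)
            noise = torch.log(u) - torch.log1p(-u)
            mask = torch.sigmoid((logits + noise) / temperature)
        else:
            mask = (logits > 0).float()                    # hard gates at test time
        out = self.conv(x) * mask[:, :, None, None]
        return out, mask                                   # mask feeds the sparsity term


def sparsity_penalty(masks, lam=0.5):
    """Sparsity objective (sketch): penalize the average fraction of
    active filters so each task selects a limited set of kernels."""
    return lam * torch.stack([m.mean() for m in masks]).mean()
```

In a full network, one such layer would replace each convolution; the masks collected along the forward pass enter the loss as `task_loss + sparsity_penalty(masks)`, and filters whose gates stay off for old tasks are the ones left free for new ones.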

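When no task oracle is available, the task classifier described in the abstract supplies the missing label before routing. A sketch of that test-time routing, again with hypothetical names (`task_classifier`, `gated_models`):

```python
import torch


@torch.no_grad()
def predict_without_oracle(task_classifier, gated_models, x):
    """Test-time routing without a task oracle (sketch, hypothetical names).

    `task_classifier` maps inputs to task logits; `gated_models[t]` is the
    shared backbone conditioned on task t's gates plus task t's output head.
    """
    task_ids = task_classifier(x).argmax(dim=1)        # predicted task per example
    outputs = []
    for i, t in enumerate(task_ids.tolist()):
        outputs.append(gated_models[t](x[i:i + 1]))    # route through task t's gates
    return torch.cat(outputs, dim=0)
```

Batching examples by predicted task, rather than looping one by one, would be the practical variant; the loop simply keeps the routing explicit.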
Related research

10/11/2022 · Toward Sustainable Continual Learning: Detection and Knowledge Repurposing of Similar Tasks
Most existing works on continual learning (CL) focus on overcoming the c...

02/25/2019 · ORACLE: Order Robust Adaptive Continual LEarning
The order of the tasks a continual learning model encounters may have la...

06/06/2019 · Uncertainty-guided Continual Learning with Bayesian Neural Networks
Continual learning aims to learn new tasks without forgetting previously...

07/18/2023 · HAT-CL: A Hard-Attention-to-the-Task PyTorch Library for Continual Learning
Catastrophic forgetting, the phenomenon in which a neural network loses ...

07/28/2021 · Task-Specific Normalization for Continual Learning of Blind Image Quality Models
The computational vision community has recently paid attention to contin...

09/03/2020 · Compression-aware Continual Learning using Singular Value Decomposition
We propose a compression based continual task learning method that can d...
