Subspace Distillation for Continual Learning

07/31/2023
by Kaushik Roy, et al.

An ultimate objective in continual learning is to preserve knowledge learned on preceding tasks while learning new ones. To mitigate forgetting of prior knowledge, we propose a novel knowledge distillation technique that takes into account the manifold structure of the latent/output space of a neural network when learning novel tasks. To achieve this, we propose to approximate the data manifold up to first order, thereby benefiting from linear subspaces to model the structure and maintain the knowledge of a neural network while it learns novel concepts. We demonstrate that modeling with subspaces provides several intriguing properties, including robustness to noise, and is therefore effective at mitigating catastrophic forgetting in continual learning. We also discuss and show how our proposed method can be adopted to address both classification and segmentation problems. Empirically, we observe that our proposed method outperforms various continual learning methods on several challenging datasets, including Pascal VOC and Tiny-ImageNet. Furthermore, we show how the proposed method can be seamlessly combined with existing learning approaches to improve their performance. The code for this article will be available at https://github.com/csiro-robotics/SDCL.
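As a rough illustration of the idea, the sketch below models each mini-batch of features with a low-dimensional linear subspace obtained via SVD and penalizes the deviation between the subspaces induced by the frozen previous model and the current one. The function names, the SVD-based subspace estimate, and the projection-matrix distance are assumptions made for illustration, not the paper's exact loss; see the repository above for the authors' implementation.

```python
# Minimal sketch of subspace distillation (hypothetical, not the authors'
# exact formulation): approximate the data manifold to first order by the
# dominant linear subspace of a feature batch, and penalize the distance
# between the old and new models' subspaces.
import torch


def batch_subspace(features: torch.Tensor, k: int = 5) -> torch.Tensor:
    """Return an orthonormal basis (dim, k) spanning the top-k directions
    of a feature batch of shape (batch, dim), estimated via SVD."""
    # Columns of U span the dominant subspace of the feature space.
    U, _, _ = torch.linalg.svd(features.T, full_matrices=False)
    return U[:, :k]


def subspace_distillation_loss(feats_new: torch.Tensor,
                               feats_old: torch.Tensor,
                               k: int = 5) -> torch.Tensor:
    """Distance between the subspaces spanned by the current model's
    features and the frozen previous model's features."""
    U_new = batch_subspace(feats_new, k)
    with torch.no_grad():  # the previous-task model is frozen
        U_old = batch_subspace(feats_old, k)
    # Compare subspaces through their orthogonal projection matrices,
    # which makes the distance invariant to the choice of basis.
    P_new = U_new @ U_new.T
    P_old = U_old @ U_old.T
    return (P_new - P_old).pow(2).sum()  # squared Frobenius (chordal) distance
```

In a continual learning loop, this term would be added to the new task's loss, with the same mini-batch passed through both the frozen old network and the trainable new one to produce `feats_old` and `feats_new`.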


