On Learning the Geodesic Path for Incremental Learning

04/17/2021
by Christian Simon, et al.

Neural networks notoriously suffer from catastrophic forgetting: the phenomenon of losing past knowledge while acquiring new knowledge. Overcoming catastrophic forgetting is essential for "incremental learning", in which a model learns from sequential experience efficiently and robustly. State-of-the-art techniques for incremental learning use knowledge distillation to prevent catastrophic forgetting: the network is updated while ensuring that its responses to previously seen concepts remain stable, which in practice is done by minimizing, in one way or another, the dissimilarity between the current and previous responses of the network. Our work contributes a novel method to this arsenal of distillation techniques. In contrast to the previous state of the art, we propose to first construct low-dimensional manifolds for the previous and current responses and then minimize the dissimilarity between the responses along the geodesic connecting these manifolds. This yields a smoother, more effective knowledge distillation that preserves past knowledge more efficiently, as our comprehensive empirical study shows.
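The abstract describes the construction only at a high level. As a rough illustration of the general idea, and not the paper's exact formulation, the sketch below builds a low-dimensional subspace for the previous and the current responses via SVD, connects the two subspaces with the standard principal-angle geodesic on the Grassmann manifold, and averages a response-mismatch penalty over points sampled along that geodesic. All names and hyperparameters here (subspace_dim, n_points, the epsilon guard) are hypothetical choices, not taken from the paper.

    import numpy as np

    def response_subspace(responses, k):
        # Orthonormal basis spanning the top-k principal directions of a
        # (n_samples x d) response matrix. Returned shape: (d, k).
        U, _, _ = np.linalg.svd(responses.T, full_matrices=False)
        return U[:, :k]

    def grassmann_geodesic(Y0, Y1):
        # Standard principal-angle construction of the geodesic between
        # span(Y0) and span(Y1) on the Grassmann manifold.
        V, cos_t, Wt = np.linalg.svd(Y0.T @ Y1)
        theta = np.arccos(np.clip(cos_t, -1.0, 1.0))  # principal angles
        # Directions of span(Y1) orthogonal to span(Y0); the epsilon guards
        # against division by zero for directions the subspaces share.
        G = (Y1 @ Wt.T - (Y0 @ V) * np.cos(theta)) / np.maximum(np.sin(theta), 1e-8)
        def basis(t):
            # (Approximately) orthonormal basis of the intermediate
            # subspace at position t in [0, 1] along the geodesic.
            return (Y0 @ V) * np.cos(t * theta) + G * np.sin(t * theta)
        return basis

    def geodesic_distillation_loss(old_resp, new_resp, subspace_dim=16, n_points=5):
        # Project both sets of responses onto subspaces sampled along the
        # geodesic and penalize the mean squared mismatch of the projections.
        basis = grassmann_geodesic(response_subspace(old_resp, subspace_dim),
                                   response_subspace(new_resp, subspace_dim))
        ts = np.linspace(0.0, 1.0, n_points)
        return np.mean([np.mean((old_resp @ basis(t) - new_resp @ basis(t)) ** 2)
                        for t in ts])

In use, old_resp and new_resp would hold the previous and current networks' responses to the same mini-batch, so the two projection terms are directly comparable; such a loss would then be added to the usual classification objective when updating the network, rather than computed post hoc as in this sketch.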

Related research

04/05/2022
Overcoming Catastrophic Forgetting in Incremental Object Detection via Elastic Response Distillation
Traditional object detectors are ill-equipped for incremental learning. ...

04/02/2022
Class-Incremental Learning by Knowledge Distillation with Adaptive Feature Consolidation
We present a novel class incremental learning approach based on deep neu...

03/06/2021
Semantic-aware Knowledge Distillation for Few-Shot Class-Incremental Learning
Few-shot class incremental learning (FSCIL) portrays the problem of lear...

06/03/2019
Random Path Selection for Incremental Learning
Incremental life-long learning is a main challenge towards the long-stan...

07/08/2018
Distillation Techniques for Pseudo-rehearsal Based Incremental Learning
The ability to learn from incrementally arriving data is essential for a...

05/20/2019
A comprehensive, application-oriented study of catastrophic forgetting in DNNs
We present a large-scale empirical study of catastrophic forgetting (CF)...

07/08/2018
Revisiting Distillation and Incremental Classifier Learning
One of the key differences between the learning mechanism of humans and ...
