Topological Continual Learning with Wasserstein Distance and Barycenter

10/06/2022
by Tananun Songdechakraiwut, et al.

Continual learning in neural networks suffers from a phenomenon called catastrophic forgetting, in which a network quickly forgets what was learned in a previous task. The human brain, however, is able to continually learn new tasks and accumulate knowledge throughout life. Neuroscience findings suggest that continual learning success in the human brain is potentially associated with its modular structure and memory consolidation mechanisms. In this paper, we propose a novel topological regularization that penalizes cycle structure in a neural network during training, using principled theory from persistent homology and optimal transport. The penalty encourages the network to learn modular structure. The penalization is based on closed-form expressions for the Wasserstein distance and barycenter of the topological features of a 1-skeleton representation of the network. Our topological continual learning method combines the proposed regularization with a tiny episodic memory to mitigate forgetting. We demonstrate that our method is effective in both shallow and deep network architectures on multiple image classification datasets.
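The closed-form expressions mentioned in the abstract lend themselves to a short sketch. Below is a minimal PyTorch sketch (ours, not the authors' released code) of the 1-dimensional persistence of a graph filtration and the closed-form Wasserstein machinery: for a connected weighted graph, cycles die at the weights of the edges left out of the maximum spanning tree; the squared 2-Wasserstein distance between two barcodes with equally many bars reduces to the L2 distance between their sorted death values; and the barycenter is the element-wise mean. It assumes a dense symmetric matrix of strictly positive edge weights, and all function names are our own.

```python
import torch
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import minimum_spanning_tree

def cycle_deaths(adj):
    # 1D persistence of a graph filtration: cycles die at the weights of
    # the edges left out of the maximum spanning tree (computed here as
    # the minimum spanning tree of the negated weights).  Tree membership
    # is found without gradients; the gathered weights stay differentiable.
    n = adj.shape[0]
    with torch.no_grad():
        tree = minimum_spanning_tree(csr_matrix(-adj.detach().cpu().numpy())).toarray()
        in_tree = torch.as_tensor((tree != 0) | (tree.T != 0), device=adj.device)
    iu = torch.triu_indices(n, n, offset=1)
    off_tree = ~in_tree[iu[0], iu[1]]
    return torch.sort(adj[iu[0], iu[1]][off_tree]).values

def w2_cycles(deaths_a, deaths_b):
    # Closed-form squared 2-Wasserstein distance between two cycle
    # barcodes with equally many bars (same architecture guarantees
    # this): the L2 distance between sorted death values.
    return torch.sum((deaths_a - deaths_b) ** 2)

def cycle_barycenter(death_sets):
    # Closed-form Wasserstein barycenter of several sorted cycle
    # barcodes: the element-wise mean of the sorted death values.
    return torch.mean(torch.stack(death_sets), dim=0)
```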
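And a sketch of how such a penalty could sit inside a replay-based training step, in the spirit of the tiny episodic memory the abstract describes. Here `memory.sample` and `skeleton` (which would assemble the network's weighted 1-skeleton from layer weight magnitudes) are hypothetical placeholders, and the weighting `lam` is an assumed hyperparameter:

```python
import torch.nn.functional as F

def train_step(model, opt, batch, memory, target_deaths, lam=0.1):
    # One update: current-task loss + replay loss from a tiny episodic
    # memory + a topological penalty pulling the network's cycle barcode
    # toward a target barycenter of previously observed barcodes.
    x, y = batch
    mx, my = memory.sample(32)  # hypothetical replay-buffer API
    opt.zero_grad()
    loss = F.cross_entropy(model(x), y) + F.cross_entropy(model(mx), my)
    # skeleton(): hypothetical helper returning the network's weighted
    # 1-skeleton adjacency matrix as a differentiable tensor
    loss = loss + lam * w2_cycles(cycle_deaths(skeleton(model)), target_deaths)
    loss.backward()
    opt.step()
    return loss.item()
```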
