CaSpeR: Latent Spectral Regularization for Continual Learning

01/09/2023
by Emanuele Frascaroli, et al.

While biological intelligence grows organically as new knowledge is gathered throughout life, Artificial Neural Networks forget catastrophically whenever they face a changing training data distribution. Rehearsal-based Continual Learning (CL) approaches have been established as a versatile and reliable solution to overcome this limitation; however, sudden input disruptions and memory constraints are known to alter the consistency of their predictions. We study this phenomenon by investigating the geometric characteristics of the learner's latent space and find that the representations of replayed data points from different classes become increasingly entangled, interfering with classification. Hence, we propose a geometric regularizer that enforces weak requirements on the Laplacian spectrum of the latent space, promoting a partitioning behavior. We show that our proposal, called Continual Spectral Regularizer (CaSpeR), can be easily combined with any rehearsal-based CL approach and improves the performance of SOTA methods on standard benchmarks. Finally, we conduct additional analysis to provide insights into CaSpeR's effects and applicability.
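The abstract leaves the regularizer at a high level, so a compact sketch may help make the idea concrete. The snippet below is a minimal, hypothetical PyTorch rendition of a latent spectral penalty in the spirit of CaSpeR: it builds a k-nearest-neighbor affinity graph over a batch of latent vectors, forms the symmetric normalized Laplacian, and pushes its k smallest eigenvalues down while widening the gap to the (k+1)-th, which in spectral graph theory corresponds to the batch splitting into roughly k well-separated groups. The cosine-similarity graph, the neighbor count, and the exact eigenvalue objective are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def casper_loss(z: torch.Tensor, k: int, n_neighbors: int = 5) -> torch.Tensor:
    """Sketch of a latent spectral regularizer (hypothetical, not the
    paper's code): penalize the k smallest Laplacian eigenvalues of a
    kNN graph over the latent batch z (shape B x D, with B > k)."""
    zn = F.normalize(z, dim=1)
    sim = zn @ zn.t()                      # cosine similarities in [-1, 1]
    w = (sim + 1.0) / 2.0                  # nonnegative edge weights in [0, 1]
    # Pick each point's n_neighbors most similar points, excluding itself.
    no_self = sim - 2.0 * torch.eye(sim.size(0), device=sim.device)
    idx = no_self.topk(n_neighbors, dim=1).indices
    mask = torch.zeros_like(sim).scatter_(1, idx, 1.0)
    mask = torch.maximum(mask, mask.t())   # symmetric kNN mask, zero diagonal
    A = mask * w                           # weighted adjacency; grads reach z
    # Symmetric normalized Laplacian: L = I - D^{-1/2} A D^{-1/2}.
    d_inv_sqrt = A.sum(dim=1).clamp_min(1e-8).rsqrt()
    L = torch.eye(A.size(0), device=z.device) \
        - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    eigvals = torch.linalg.eigvalsh(L)     # ascending order, differentiable
    # Shrink the k smallest eigenvalues and widen the gap to the (k+1)-th:
    # a small sum and a large gap indicate ~k connected components.
    return eigvals[:k].sum() - eigvals[k]
```

In a rehearsal-based learner, such a term would be computed on the latent features of a replay batch, with k set to the number of classes present in that batch, and added to the usual stream-plus-replay objective with a weighting coefficient, e.g.:

```python
feats = torch.randn(32, 128, requires_grad=True)  # stand-in encoder outputs
reg = 0.1 * casper_loss(feats, k=4)               # hypothetical weight and k
reg.backward()
```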

