Multiband VAE: Latent Space Partitioning for Knowledge Consolidation in Continual Learning

06/23/2021
by   Kamil Deja, et al.

We propose a new method for unsupervised continual knowledge consolidation in generative models that relies on partitioning a Variational Autoencoder's latent space. Acquiring knowledge about new data samples without forgetting previous ones is a critical problem in continual learning. Currently proposed methods achieve this goal by extending the existing model while constraining its behavior so that it does not degrade on past data, which does not exploit the full potential of relations within the entire training dataset. In this work, we identify this limitation and frame continual learning as a knowledge accumulation task. We solve it by continuously re-aligning latent space partitions that we call bands, which are representations of samples seen in different tasks, driven by the similarity of the information they contain. In addition, we introduce a simple yet effective method for controlled forgetting of past data that improves the quality of reconstructions encoded in latent bands, and a latent space disentanglement technique that improves knowledge consolidation. On top of the standard continual learning evaluation benchmarks, we evaluate our method on a new knowledge consolidation scenario and show that the proposed approach outperforms the state of the art by up to twofold across all testing scenarios.
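The abstract does not specify how the latent bands are laid out; one common way to realize such a partition, shown here as a minimal illustrative sketch (an assumption, not the authors' actual architecture), is to assign each task a contiguous slice of the latent vector and zero out coordinates outside a task's band:

```python
def band_slices(latent_dim, n_tasks):
    """Split a latent vector of size latent_dim into n_tasks contiguous,
    non-overlapping bands. Returns one slice object per task."""
    base, rem = divmod(latent_dim, n_tasks)
    slices, start = [], 0
    for t in range(n_tasks):
        size = base + (1 if t < rem else 0)  # spread the remainder over early tasks
        slices.append(slice(start, start + size))
        start += size
    return slices

def mask_to_band(z, band):
    """Keep only the coordinates of latent code z (a flat list of floats)
    that fall inside the given task's band; zero out the rest."""
    out = [0.0] * len(z)
    out[band] = z[band]
    return out

# Example: an 8-dimensional latent space shared by 3 tasks
# yields bands of sizes 3, 3, and 2.
bands = band_slices(8, 3)
z = [0.5, -1.2, 0.3, 2.1, -0.7, 0.9, 1.4, -0.4]
z_task0 = mask_to_band(z, bands[0])  # only the first 3 coordinates survive
```

In the paper's setting, such per-task bands would additionally be re-aligned across tasks based on the similarity of the samples they encode; the fixed contiguous layout above is only the simplest starting point.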

Related research

- Continual Semantic Segmentation via Repulsion-Attraction of Sparse and Disentangled Latent Representations (03/10/2021). Deep neural networks suffer from the major limitation of catastrophic fo...
- CaSpeR: Latent Spectral Regularization for Continual Learning (01/09/2023). While biological intelligence grows organically as new knowledge is gath...
- Looking through the past: better knowledge retention for generative replay in continual learning (09/18/2023). In this work, we improve the generative replay in a continual learning s...
- Encoding Binary Concepts in the Latent Space of Generative Models for Enhancing Data Representation (03/22/2023). Binary concepts are empirically used by humans to generalize efficiently...
- Keep and Learn: Continual Learning by Constraining the Latent Space for Knowledge Preservation in Neural Networks (05/28/2018). Data is one of the most important factors in machine learning. However, ...
- Continual Learning of Predictive Models in Video Sequences via Variational Autoencoders (06/02/2020). This paper proposes a method for performing continual learning of predic...
- Gradient-Matching Coresets for Rehearsal-Based Continual Learning (03/28/2022). The goal of continual learning (CL) is to efficiently update a machine l...
