
OvA-INN: Continual Learning with Invertible Neural Networks

by   G. Hocquet, et al.
Paris-Sud University

In the field of Continual Learning, the objective is to learn several tasks one after the other without access to the data from previous tasks. Several solutions have been proposed to tackle this problem, but they usually assume that, at test time, the user knows which task a given sample belongs to, or they rely on storing small subsets of previous data; most of them also suffer a substantial drop in accuracy when updated with batches of only one class at a time. In this article, we propose a new method, OvA-INN, which is able to learn one class at a time without storing any of the previous data. To achieve this, for each class we train a specific Invertible Neural Network to extract the features relevant for computing the likelihood of that class. At test time, we predict the class of a sample by identifying the network that outputs the highest likelihood. With this method, we show that we can take advantage of pretrained models by stacking an Invertible Network on top of a feature extractor. This way, we are able to outperform state-of-the-art approaches that rely on feature learning for Continual Learning on the MNIST and CIFAR-100 datasets. In our experiments, we reach 72% accuracy after training our model one class at a time.
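The one-vs-all decision rule described above can be sketched in a few lines. In this illustrative snippet, a diagonal-Gaussian density stands in for each class's Invertible Neural Network (a real INN would compute an exact log-likelihood via the change-of-variables formula, but the prediction rule is identical); the class names and `ClassDensity` helper are assumptions made for the example, not part of the paper's code.

```python
import numpy as np

class ClassDensity:
    """Per-class density model; stand-in for a per-class INN."""

    def fit(self, feats):
        # Fit a diagonal Gaussian to this class's feature vectors.
        self.mu = feats.mean(axis=0)
        self.var = feats.var(axis=0) + 1e-6  # avoid division by zero
        return self

    def log_likelihood(self, x):
        # Diagonal-Gaussian log-density of a single sample x.
        return -0.5 * np.sum(np.log(2 * np.pi * self.var)
                             + (x - self.mu) ** 2 / self.var)

def predict(models, x):
    # One-vs-all rule: pick the class whose model assigns the
    # highest log-likelihood to the sample.
    return max(models, key=lambda c: models[c].log_likelihood(x))

rng = np.random.default_rng(0)
models = {
    0: ClassDensity().fit(rng.normal(0.0, 1.0, size=(100, 8))),
    1: ClassDensity().fit(rng.normal(5.0, 1.0, size=(100, 8))),
}
print(predict(models, np.full(8, 5.2)))  # a point near class 1's mean -> 1
```

Because each class has its own independent model, adding a new class only requires fitting one more model; no previously trained model is touched, which is why the approach avoids catastrophic forgetting when learning one class at a time.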



