Incremental Class Learning using Variational Autoencoders with Similarity Learning

10/04/2021
by Jiahao Huo, et al.

Catastrophic forgetting in neural networks during incremental learning remains a challenging problem. Previous research investigated catastrophic forgetting in fully connected networks, with some earlier work exploring activation functions and learning algorithms. Applications of neural networks have since been extended to include similarity and metric learning, so it is of significant interest to understand how metric-learning loss functions are affected by catastrophic forgetting. Our research investigates catastrophic forgetting for four well-known metric-based loss functions during incremental class learning: angular, contrastive, centre, and triplet loss. Our results show that the rate of catastrophic forgetting differs across loss functions and datasets: angular loss was least affected, followed by contrastive loss, triplet loss, and centre loss with good mining techniques. We implemented three existing incremental learning techniques, iCaRL, EWC, and EBLL, and further proposed a novel technique that uses variational autoencoders (VAEs) to generate representations as exemplars, which are passed through intermediate layers of the network. Our method outperformed the three existing techniques, showing that stored images are not required as exemplars for incremental learning with similarity learning. The generated representations help preserve the regions of the embedding space used by prior knowledge, so that new knowledge does not "overwrite" it.
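The core idea, a VAE that generates intermediate-layer representations to replay as exemplars under a metric loss, can be sketched roughly as follows. This is a minimal illustration under our own assumptions, not the authors' implementation: the framework (PyTorch), the names (RepVAE, triplet_loss, sample_pseudo_exemplars), and the dimensions (rep_dim, latent_dim) are all hypothetical.

```python
# Minimal sketch, not the authors' code. RepVAE, rep_dim and latent_dim
# are hypothetical names/choices used purely for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RepVAE(nn.Module):
    """A small VAE over intermediate-layer representations (not raw images)."""
    def __init__(self, rep_dim=256, latent_dim=32):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(rep_dim, 128), nn.ReLU())
        self.mu = nn.Linear(128, latent_dim)
        self.logvar = nn.Linear(128, latent_dim)
        self.dec = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(),
                                 nn.Linear(128, rep_dim))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterisation
        return self.dec(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    # Reconstruction error plus KL divergence to the unit-Gaussian prior.
    rec = F.mse_loss(recon, x)
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + kl

def triplet_loss(anchor, positive, negative, margin=0.2):
    # Standard triplet loss on L2-normalised embeddings (one of the four
    # metric losses studied; angular, contrastive and centre loss differ).
    a = F.normalize(anchor, dim=1)
    p = F.normalize(positive, dim=1)
    n = F.normalize(negative, dim=1)
    d_ap = (a - p).pow(2).sum(dim=1)
    d_an = (a - n).pow(2).sum(dim=1)
    return F.relu(d_ap - d_an + margin).mean()

# After finishing a task: fit the VAE on that task's intermediate
# representations, then discard the images. During the next task: sample
# pseudo-exemplars and replay them through the remaining layers so the
# embedding regions used by earlier classes are not overwritten.
def sample_pseudo_exemplars(vae, batch_size=64, latent_dim=32):
    with torch.no_grad():
        return vae.dec(torch.randn(batch_size, latent_dim))
```

In training, the sampled representations would be fed into the layers above the chosen intermediate point alongside new-class data, with the metric loss applied to both, so that regions of the embedding space used by prior classes stay anchored without storing any images.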

Related research

- Constellation Loss: Improving the efficiency of deep metric learning loss functions for optimal embedding (05/25/2019). "Metric learning has become an attractive field for research on the lates..."
- Uncertainty-aware Contrastive Distillation for Incremental Semantic Segmentation (03/26/2022). "A fundamental and challenging problem in deep learning is catastrophic f..."
- SCALE: Online Self-Supervised Lifelong Learning without Prior Knowledge (08/24/2022). "Unsupervised lifelong learning refers to the ability to learn over time ..."
- A Comparison of Metric Learning Loss Functions for End-To-End Speaker Verification (03/31/2020). "Despite the growing popularity of metric learning approaches, very littl..."
- A Strategy for an Uncompromising Incremental Learner (05/02/2017). "Multi-class supervised learning systems require the knowledge of the ent..."
- Fisher Discriminant Triplet and Contrastive Losses for Training Siamese Networks (04/05/2020). "Siamese neural network is a very powerful architecture for both feature ..."
- Overcoming Catastrophic Forgetting by Neuron-level Plasticity Control (07/31/2019). "To address the issue of catastrophic forgetting in neural networks, we p..."
