Incremental Embedding Learning via Zero-Shot Translation

12/31/2020
by Kun Wei, et al.

Modern deep learning methods have achieved great success in machine learning and computer vision by learning from a set of pre-defined datasets. However, these methods perform unsatisfactorily when applied to real-world situations, because learning new tasks causes the trained model to quickly forget the knowledge of old tasks, a phenomenon referred to as catastrophic forgetting. Current state-of-the-art incremental learning methods tackle the catastrophic forgetting problem in traditional classification networks and ignore the problem in embedding networks, which are the basic networks for image retrieval, face recognition, zero-shot learning, etc. Unlike traditional incremental classification networks, the main challenge for embedding networks under the incremental learning setting is the semantic gap between the embedding spaces of two adjacent tasks. Thus, we propose a novel class-incremental method for embedding networks, named the zero-shot translation class-incremental method (ZSTCI), which leverages zero-shot translation to estimate and compensate for the semantic gap without any exemplars. We then learn a unified representation for two adjacent tasks in the sequential learning process, which precisely captures the relationships between previous and current classes. In addition, ZSTCI can easily be combined with existing regularization-based incremental learning methods to further improve the performance of embedding networks. We conduct extensive experiments on CUB-200-2011 and CIFAR100, and the experimental results demonstrate the effectiveness of our method. The code of our method has been released.
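As a rough illustration of the compensation idea described in the abstract (not the authors' released implementation), the sketch below trains a small translation network that maps embeddings produced by the frozen previous-task encoder into the current embedding space, using only current-task images, so no exemplars of old classes are needed. Names such as `EmbeddingTranslator` and `train_translator` are hypothetical, and the mean-squared alignment loss is an assumption made for the sketch.

```python
# Minimal sketch (PyTorch) of translation-based gap compensation between two
# adjacent embedding spaces. All names are illustrative, not from the paper.
import torch
import torch.nn as nn


class EmbeddingTranslator(nn.Module):
    """Maps embeddings from the previous task's space into the current one."""

    def __init__(self, dim: int, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, dim),
        )

    def forward(self, z_old: torch.Tensor) -> torch.Tensor:
        return self.net(z_old)


def train_translator(old_encoder, new_encoder, loader, dim,
                     epochs=10, lr=1e-3, device="cpu"):
    """Estimate the semantic gap between two adjacent embedding spaces.

    old_encoder: frozen embedding network from the previous task.
    new_encoder: embedding network after training on the current task.
    loader:      current-task images only (no stored old-class exemplars).
    """
    translator = EmbeddingTranslator(dim).to(device)
    optimizer = torch.optim.Adam(translator.parameters(), lr=lr)
    old_encoder.eval()
    new_encoder.eval()

    for _ in range(epochs):
        for images, _ in loader:
            images = images.to(device)
            with torch.no_grad():
                z_old = old_encoder(images)  # embedding in the previous space
                z_new = new_encoder(images)  # embedding in the current space
            z_translated = translator(z_old)
            # Align translated old-space embeddings with the current space,
            # compensating the drift without replaying old-class data.
            loss = nn.functional.mse_loss(z_translated, z_new)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return translator
```

Once trained, such a translator could be applied to any stored prototypes or features from earlier tasks, placing old and new classes in one unified embedding space; this is only one plausible way to realize the idea sketched in the abstract.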


