ZS-IL: Looking Back on Learned Experiences for Zero-Shot Incremental Learning

03/22/2021
by Mozhgan PourKeshavarz, et al.

Classical deep neural networks are limited in their ability to learn from emerging streams of training data. When trained sequentially on new or evolving tasks, their performance degrades sharply, making them unsuitable for real-world use. Existing methods tackle this by either storing old data samples or updating only a subset of the DNN's parameters, which either demands a large memory budget or restricts the model's flexibility to learn the distribution of newly added classes. In this paper, we shed light on an on-call transfer set that provides past experiences whenever a new class arises in the data stream. In particular, we propose Zero-Shot Incremental Learning (ZS-IL), which not only replays the experiences the model has learned but does so in a zero-shot manner. To this end, we introduce a memory recovery paradigm in which we query the network itself to synthesize past exemplars whenever a new task (class) emerges. Our method therefore needs no fixed-size memory; instead, it invokes the memory recovery paradigm to produce past exemplars, termed a transfer set, in order to mitigate catastrophic forgetting of the former classes. Moreover, in contrast to recently proposed methods, the suggested paradigm does not require a parallel architecture, since it relies only on the learner network. Compared to state-of-the-art techniques that do not buffer past data samples, ZS-IL demonstrates significantly better performance on the well-known CIFAR-10 and Tiny-ImageNet datasets in both Task-IL and Class-IL settings.
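The memory recovery idea described above — querying the learner network itself to synthesize exemplars of past classes, rather than storing data or training a separate generator — can be sketched as model inversion by input optimization. The following is a minimal, hedged illustration, not the paper's actual implementation: the function name `synthesize_transfer_set` and all hyperparameters are assumptions, and the real method likely uses additional regularization on the synthesized inputs.

```python
# Illustrative sketch (NOT the paper's code): recover a "transfer set"
# of past-class exemplars by inverting the frozen learner network.
import torch
import torch.nn.functional as F

def synthesize_transfer_set(model, old_classes, n_exemplars=8,
                            input_shape=(3, 32, 32), steps=100, lr=0.1):
    """Optimize random inputs so the frozen learner assigns them to each
    previously learned class with high confidence; the resulting images
    act as replay exemplars (the "transfer set")."""
    model.eval()
    images, labels = [], []
    for c in old_classes:
        # Start from noise; only the inputs are optimized, never the model.
        x = torch.randn(n_exemplars, *input_shape, requires_grad=True)
        target = torch.full((n_exemplars,), c, dtype=torch.long)
        opt = torch.optim.Adam([x], lr=lr)
        for _ in range(steps):
            opt.zero_grad()
            # Push the learner's prediction on x toward the old class c.
            loss = F.cross_entropy(model(x), target)
            loss.backward()
            opt.step()
        images.append(x.detach())
        labels.append(target)
    return torch.cat(images), torch.cat(labels)
```

When a new class arrives, such a synthesized set would be mixed with the new-class data for joint training, so no fixed-size buffer of real past samples is ever kept.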


