Lifelong GAN: Continual Learning for Conditional Image Generation

07/23/2019
by Mengyao Zhai, et al.

Lifelong learning is challenging for deep neural networks due to their susceptibility to catastrophic forgetting: a trained network loses its ability to perform previously learned tasks when it is trained on new ones. We study lifelong learning for generative models, extending a trained network to new conditional generation tasks without forgetting previous tasks, while assuming access to the training data for the current task only. In contrast to state-of-the-art memory-replay-based approaches, which are limited to label-conditioned image generation, we propose a more generic framework for continual learning of generative models under different conditional image generation settings. Lifelong GAN employs knowledge distillation to transfer learned knowledge from previous networks to the new network, which makes image-conditioned generation possible in a lifelong learning setting. We validate Lifelong GAN on both image-conditioned and label-conditioned generation tasks, and provide qualitative and quantitative results demonstrating the generality and effectiveness of our method.
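The core idea above is knowledge distillation between generators: when training on a new task, the new generator is also penalized for deviating from a frozen copy of the previous generator on auxiliary inputs, so earlier tasks are not forgotten. Below is a minimal, hedged sketch of that distillation term using tiny linear "generators" in NumPy; all names (`generator`, `distillation_loss`, `aux_conditions`) are illustrative stand-ins, not the paper's actual implementation.

```python
import numpy as np

def generator(weights, conditions):
    """Toy stand-in for a conditional generator: a simple linear map
    from conditioning inputs to generated outputs."""
    return conditions @ weights

def distillation_loss(old_weights, new_weights, aux_conditions):
    """Mean L1 distance between the frozen old generator's outputs and
    the new generator's outputs on auxiliary conditioning inputs.
    Minimizing this term encourages the new model to reproduce the old
    model's behavior, i.e., to remember previous tasks."""
    old_out = generator(old_weights, aux_conditions)  # frozen teacher
    new_out = generator(new_weights, aux_conditions)  # trainable student
    return np.abs(old_out - new_out).mean()

rng = np.random.default_rng(0)
old_w = rng.normal(size=(4, 3))                 # frozen copy from task t-1
new_w = old_w + 0.1 * rng.normal(size=(4, 3))   # being trained on task t
aux = rng.normal(size=(8, 4))                   # auxiliary conditioning inputs

loss = distillation_loss(old_w, new_w, aux)
print(loss)
```

In practice this distillation term would be added to the usual adversarial loss for the current task; the balance between the two controls the trade-off between remembering old tasks and learning the new one.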


Related research:

- Piggyback GAN: Efficient Lifelong Learning for Image Conditioned Generation (04/24/2021)
- On robustness of generative representations against catastrophic forgetting (09/04/2021)
- GAN Memory with No Forgetting (06/13/2020)
- Overcoming Catastrophic Forgetting with Gaussian Mixture Replay (04/19/2021)
- Association: Remind Your GAN not to Forget (11/27/2020)
- Continual learning with hypernetworks (06/03/2019)
- Hypothesis-driven Stream Learning with Augmented Memory (04/06/2021)
