Piggyback GAN: Efficient Lifelong Learning for Image Conditioned Generation

by Mengyao Zhai, et al.

Humans accumulate knowledge in a lifelong fashion. Modern deep neural networks, on the other hand, are susceptible to catastrophic forgetting: when adapted to perform new tasks, they often fail to preserve their performance on previously learned tasks. Given a sequence of tasks, a naive approach to addressing catastrophic forgetting is to train a separate standalone model for each task, which drastically scales the total number of parameters without reusing previous models. In contrast, we propose a parameter-efficient framework, Piggyback GAN, which learns the current task by building a set of convolutional and deconvolutional filters that are factorized into the filters of models trained on previous tasks. For the current task, our model achieves generation quality on par with a standalone model while using fewer parameters. For previous tasks, our model also preserves generation quality, since the filters for previous tasks are never altered. We validate Piggyback GAN on various image-conditioned generation tasks across different domains, and provide qualitative and quantitative results showing that the proposed approach addresses catastrophic forgetting effectively and efficiently.
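The core idea in the abstract — expressing most of a new task's filters as combinations of frozen filters from earlier tasks, plus a few unconstrained new filters — can be illustrated with a minimal NumPy sketch. All shapes, the 12/4 split between factorized and new filters, and the variable names here are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical conv layer: 16 filters, each 8 input channels x 3x3 kernel,
# flattened to one row per filter. These come from a previous task and are frozen.
n_prev, filt_dim = 16, 8 * 3 * 3
prev_filters = rng.standard_normal((n_prev, filt_dim))  # frozen filter bank

# For the new task, factorize most filters as linear combinations of the
# frozen bank, and learn only a small number of brand-new filters.
n_factorized, n_new = 12, 4
mix_weights = rng.standard_normal((n_factorized, n_prev)) * 0.1  # learned per task
new_filters = rng.standard_normal((n_new, filt_dim))             # learned per task

factorized = mix_weights @ prev_filters                           # (12, filt_dim)
task_filters = np.concatenate([factorized, new_filters], axis=0)  # (16, filt_dim)

# Per-task parameter cost: only the mixing matrix and the new filters are
# stored, versus a full standalone bank of 16 filters.
params_piggyback = mix_weights.size + new_filters.size   # 12*16 + 4*72 = 480
params_standalone = n_prev * filt_dim                    # 16*72 = 1152
print(task_filters.shape, params_piggyback, params_standalone)
```

Because `prev_filters` is frozen, earlier tasks' outputs are bit-identical after learning the new task, while the new task stores far fewer parameters than a standalone model — the trade-off the abstract describes.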






