Piggyback GAN: Efficient Lifelong Learning for Image Conditioned Generation

04/24/2021
by Mengyao Zhai, et al.

Humans accumulate knowledge in a lifelong fashion. Modern deep neural networks, on the other hand, are susceptible to catastrophic forgetting: when adapted to perform new tasks, they often fail to preserve their performance on previously learned tasks. Given a sequence of tasks, a naive way to avoid catastrophic forgetting is to train a separate standalone model for each task, but this grows the total number of parameters linearly with the number of tasks and makes no use of previously learned models. In contrast, we propose a parameter-efficient framework, Piggyback GAN, which learns the current task by building a set of convolutional and deconvolutional filters that are factorized into the filters of the models trained on previous tasks. For the current task, our model achieves generation quality on par with a standalone model while using fewer parameters. For previous tasks, our model also preserves generation quality, since the filters learned for those tasks are never altered. We validate Piggyback GAN on various image-conditioned generation tasks across different domains, and provide qualitative and quantitative results showing that the proposed approach addresses catastrophic forgetting effectively and efficiently.
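To make the filter-factorization idea concrete, below is a minimal, hypothetical PyTorch sketch (not the authors' released code). It assumes a frozen "filter bank" collected from models trained on previous tasks; for the current task, most filters are learned as linear combinations of the bank, and only a small set of unconstrained filters is trained from scratch. The class and argument names (PiggybackConv2d, n_derived, n_free) are illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class PiggybackConv2d(nn.Module):
    """Sketch of a "piggyback" convolution: most filters for the current task
    are learned linear combinations of a frozen filter bank from previous
    tasks, plus a small number of unconstrained filters."""

    def __init__(self, filter_bank, n_derived, n_free, stride=1, padding=1):
        super().__init__()
        # filter_bank: (n_bank, in_ch, k, k) filters reused from earlier tasks.
        # Registered as a buffer so it stays frozen and earlier tasks are untouched.
        self.register_buffer("filter_bank", filter_bank)
        n_bank, in_ch, k, _ = filter_bank.shape
        # Mixing weights: each derived filter is a weighted combination of bank filters.
        self.mix = nn.Parameter(0.01 * torch.randn(n_derived, n_bank))
        # A few unconstrained filters trained from scratch for this task;
        # after training they could be appended to the filter bank.
        self.free = nn.Parameter(0.01 * torch.randn(n_free, in_ch, k, k))
        self.stride, self.padding = stride, padding

    def forward(self, x):
        # Derived filters: combine frozen bank filters along the bank dimension.
        derived = torch.einsum("db,bikj->dikj", self.mix, self.filter_bank)
        weight = torch.cat([derived, self.free], dim=0)  # (n_derived + n_free, in_ch, k, k)
        return F.conv2d(x, weight, stride=self.stride, padding=self.padding)


# Example: a bank of 64 3x3 filters from previous tasks; the new task derives
# 56 filters from the bank and trains only 8 new ones.
bank = torch.randn(64, 3, 3, 3)
layer = PiggybackConv2d(bank, n_derived=56, n_free=8)
out = layer(torch.randn(1, 3, 32, 32))  # -> (1, 64, 32, 32)
```

Only the mixing weights and the few free filters are task-specific parameters, which is why the per-task parameter cost stays well below that of a standalone model while previously learned filters remain intact.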
