- On catastrophic forgetting and mode collapse in Generative Adversarial Networks
  Generative Adversarial Networks (GAN) are one of the most prominent tool...
- Lifelong GAN: Continual Learning for Conditional Image Generation
  Lifelong learning is challenging for deep neural networks due to their s...
- Memory Replay GANs: learning to generate images from new categories without forgetting
  Previous works on sequential learning address the problem of forgetting ...
- Association: Remind Your GAN not to Forget
  Neural networks are susceptible to catastrophic forgetting. They fail to...
- Incremental Knowledge Based Question Answering
  In the past years, Knowledge-Based Question Answering (KBQA), which aims...
- Closed-Loop GAN for Continual Learning
  Sequential learning of tasks using gradient descent leads to an unremitt...
- Convolution Forgetting Curve Model for Repeated Learning
  Most of mathematic forgetting curve models fit well with the forgetting ...
GAN Memory with No Forgetting
Seeking to address the fundamental issue of memory in lifelong learning, we propose a GAN memory that is capable of realistically remembering a stream of generative processes with no forgetting. Our GAN memory is based on recognizing that one can modulate the “style” of a GAN model to form perceptually-distant targeted generation. Accordingly, we propose to do sequential style modulations atop a well-behaved base GAN model, to form sequential targeted generative models, while simultaneously benefiting from the transferred base knowledge. Experiments demonstrate the superiority of our method over existing approaches and its effectiveness in alleviating catastrophic forgetting for lifelong classification problems.
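The core idea above, keeping one frozen, well-behaved base generator and learning only a small set of per-task "style" modulation parameters, can be illustrated with a minimal sketch. The class below is an assumption-laden toy, not the authors' architecture: a single frozen linear layer stands in for the base GAN, and each new task adds only channel-wise scale (gamma) and shift (beta) parameters that modulate the shared base weights, so earlier tasks cannot be overwritten.

```python
import numpy as np

class StyleModulatedLayer:
    """One layer of a frozen base generator whose shared weights are
    modulated per task by a learned scale (gamma) and shift (beta).
    A hypothetical sketch of sequential style modulation; the real
    GAN memory operates on a full convolutional generator."""

    def __init__(self, in_dim, out_dim, rng):
        # Base weights: trained once on the base task, then frozen.
        self.W = rng.standard_normal((out_dim, in_dim)) * 0.1
        # Per-task modulation parameters: task_id -> (gamma, beta).
        # These would be the only parameters trained for a new task.
        self.tasks = {}

    def add_task(self, task_id, rng):
        # Each task stores just 2 * out_dim numbers; the frozen base
        # W is shared, so adding a task cannot corrupt older ones.
        gamma = 1.0 + 0.01 * rng.standard_normal(self.W.shape[0])
        beta = 0.01 * rng.standard_normal(self.W.shape[0])
        self.tasks[task_id] = (gamma, beta)

    def forward(self, x, task_id):
        gamma, beta = self.tasks[task_id]
        # Channel-wise modulation of the frozen weights for this task.
        W_task = gamma[:, None] * self.W
        return W_task @ x + beta
```

Because the base weights are never updated after the initial training, "remembering" a past generative process reduces to looking up its (gamma, beta) pair, which is what makes forgetting impossible by construction in this scheme.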