Towards Lifelong Self-Supervision For Unpaired Image-to-Image Translation

03/31/2020
by Victor Schmidt, et al.

Unpaired Image-to-Image Translation (I2IT) tasks often suffer from a lack of data, a problem that self-supervised learning (SSL) has recently proven very successful at tackling. By leveraging auxiliary tasks such as rotation prediction or generative colorization, SSL can produce better and more robust representations in a low-data regime. However, training such tasks alongside an I2IT task becomes computationally intractable as model size and the number of tasks grow. Learning tasks sequentially, on the other hand, can incur catastrophic forgetting of previously learned tasks. To alleviate this, we introduce Lifelong Self-Supervision (LiSS), a way to pre-train an I2IT model (e.g., CycleGAN) on a set of self-supervised auxiliary tasks. By keeping an exponential moving average of past encoders and distilling the accumulated knowledge, we are able to maintain the network's validation performance on a number of tasks without any form of replay, parameter isolation, or retraining technique typically used in continual learning. We show that models trained with LiSS perform better on past tasks, while also being more robust than the CycleGAN baseline to color bias and entity entanglement (when two entities are very close).
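The abstract names two mechanisms: an exponential moving average (EMA) of past encoders and distillation of the accumulated knowledge into the current encoder. Below is a minimal PyTorch sketch of how these two pieces typically fit together; the function names, the MSE feature-matching target, and hyperparameters such as decay and distill_weight are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch of the LiSS idea as described in the abstract: maintain an EMA copy
# of the encoder and distill its features into the current encoder, so past
# task knowledge is preserved without replay or parameter isolation.
# All names here are illustrative assumptions, not the paper's API.
import copy

import torch
import torch.nn.functional as F


def update_ema(ema_encoder, encoder, decay=0.999):
    """Move the EMA encoder's parameters toward the current encoder's."""
    with torch.no_grad():
        for ema_p, p in zip(ema_encoder.parameters(), encoder.parameters()):
            ema_p.mul_(decay).add_(p, alpha=1.0 - decay)


def distillation_loss(encoder, ema_encoder, x):
    """Penalize drift between current and EMA encoder features on batch x."""
    with torch.no_grad():
        target = ema_encoder(x)  # accumulated knowledge, treated as fixed
    return F.mse_loss(encoder(x), target)


# Hypothetical training step:
#   ema_encoder = copy.deepcopy(encoder)          # once, at setup
#   loss = task_loss + distill_weight * distillation_loss(encoder, ema_encoder, x)
#   loss.backward(); optimizer.step()
#   update_ema(ema_encoder, encoder)              # after each update
```

One design note: because the EMA encoder is a slowly moving average rather than a frozen snapshot per task, a single distillation target can stand in for all past tasks, which is what lets the method avoid replay buffers or per-task parameter isolation.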

Related research

06/28/2021 - Co^2L: Contrastive Continual Learning
Recent breakthroughs in self-supervised learning show that such algorith...

03/30/2023 - Practical self-supervised continual learning with continual fine-tuning
Self-supervised learning (SSL) has shown remarkable performance in compu...

07/13/2022 - Task Agnostic Representation Consolidation: a Self-supervised based Continual Learning Approach
Continual learning (CL) over non-stationary data streams remains one of ...

12/08/2021 - Self-Supervised Models are Continual Learners
Self-supervised models have been shown to produce comparable or better v...

10/27/2018 - Self-Supervised GAN to Counter Forgetting
GANs involve training two networks in an adversarial game, where each ne...

03/16/2023 - CSSL-MHTR: Continual Self-Supervised Learning for Scalable Multi-script Handwritten Text Recognition
Self-supervised learning has recently emerged as a strong alternative in...
