Improved Input Reprogramming for GAN Conditioning

01/07/2022
by Tuan Dinh, et al.

We study the GAN conditioning problem, whose goal is to convert a pretrained unconditional GAN into a conditional GAN using labeled data. We first identify and analyze three approaches to this problem – conditional GAN training from scratch, fine-tuning, and input reprogramming. Our analysis reveals that when the amount of labeled data is small, input reprogramming performs the best. Motivated by real-world scenarios with scarce labeled data, we focus on the input reprogramming approach and carefully analyze the existing algorithm. After identifying a few critical issues of the previous input reprogramming approach, we propose a new algorithm called InRep+. Our algorithm InRep+ addresses the existing issues with the novel uses of invertible neural networks and Positive-Unlabeled (PU) learning. Via extensive experiments, we show that InRep+ outperforms all existing methods, particularly when label information is scarce, noisy, and/or imbalanced. For instance, for the task of conditioning a CIFAR10 GAN with 1% labeled data, InRep+ achieves an Intra-FID of 82.13, whereas the second-best method achieves 114.51.
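The core idea of input reprogramming is to keep the pretrained generator frozen and learn a small per-class "modifier" network that maps noise into the region of the latent space producing that class; InRep+ makes this modifier invertible. The sketch below illustrates the invertibility property with a toy affine coupling layer in NumPy. All names, dimensions, and the random (untrained) parameters are hypothetical; the actual InRep+ modifiers are trained adversarially against the frozen GAN.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8          # latent dimension of the (hypothetical) pretrained GAN
NUM_CLASSES = 4  # number of conditioning classes

class AffineCoupling:
    """Toy invertible layer: one per class, reprogramming noise z -> z_c.

    A stand-in for the invertible modifier network in input reprogramming;
    parameters here are random, whereas InRep+ learns them from labeled data.
    """
    def __init__(self, dim, rng):
        self.half = dim // 2
        # Per-class scale/shift parameters (learned in practice).
        self.w_s = rng.normal(scale=0.1, size=(self.half, self.half))
        self.w_t = rng.normal(scale=0.1, size=(self.half, self.half))

    def forward(self, z):
        z1, z2 = z[: self.half], z[self.half:]
        s = np.tanh(self.w_s @ z1)   # bounded log-scale for stability
        t = self.w_t @ z1            # shift
        return np.concatenate([z1, z2 * np.exp(s) + t])

    def inverse(self, y):
        y1, y2 = y[: self.half], y[self.half:]
        s = np.tanh(self.w_s @ y1)
        t = self.w_t @ y1
        return np.concatenate([y1, (y2 - t) * np.exp(-s)])

# One modifier per class; conditional sampling picks the modifier for the
# desired class and feeds the reprogrammed latent to the frozen generator.
modifiers = [AffineCoupling(DIM, rng) for _ in range(NUM_CLASSES)]

z = rng.normal(size=DIM)
z_cond = modifiers[2].forward(z)        # latent steered toward class 2
z_back = modifiers[2].inverse(z_cond)   # exact inversion, no information loss
print(np.allclose(z, z_back))           # True
```

Invertibility is what distinguishes this design from an unconstrained modifier: no latent noise is "wasted", so the reprogrammed distribution can still cover the class-conditional support of the frozen generator.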


