Soft Curriculum for Learning Conditional GANs with Noisy-Labeled and Uncurated Unlabeled Data

07/17/2023
by   Kai Katsumata, et al.

Noisy-labeled data or curated unlabeled data are commonly used to relax the assumption of clean labeled data when training conditional generative adversarial networks (GANs); however, satisfying even these relaxed assumptions is occasionally laborious or impractical. As a step towards generative modeling accessible to everyone, we introduce a novel conditional image generation framework that accepts noisy-labeled and uncurated unlabeled data during training: (i) closed-set and open-set label noise in labeled data and (ii) closed-set and open-set unlabeled data. To handle both types of imperfect data, we propose soft curriculum learning, which assigns instance-wise weights for adversarial training while assigning new labels to unlabeled data and correcting wrong labels in labeled data. Unlike popular curriculum learning, which uses a threshold to select training samples, our soft curriculum controls the contribution of each training instance through weights predicted by the auxiliary classifier, preserving useful samples while suppressing harmful ones. Our experiments show that our approach outperforms existing semi-supervised and label-noise-robust methods in both quantitative and qualitative performance. In particular, the proposed approach matches the performance of (semi-)supervised GANs even with less than half the labeled data.
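
To make the weighting idea concrete, the following is a minimal sketch of how instance-wise soft-curriculum weights could enter the discriminator's real-sample loss, assuming a PyTorch-style setup. The names (aux_classifier, discriminator, the hinge-loss form, and the simplified label-correction rule) are illustrative assumptions, not the authors' exact implementation.

```python
# Sketch: soft-curriculum weighting for cGAN training on noisy-labeled and
# unlabeled real images. Every sample contributes, scaled by the auxiliary
# classifier's confidence, instead of being kept or discarded by a threshold.
import torch
import torch.nn.functional as F

def soft_curriculum_real_loss(x_labeled, x_unlabeled, aux_classifier, discriminator):
    """Instance-weighted discriminator loss on real (labeled + unlabeled) images."""
    x_real = torch.cat([x_labeled, x_unlabeled], dim=0)

    with torch.no_grad():
        probs = F.softmax(aux_classifier(x_real), dim=1)   # (N, num_classes)
        conf, pred = probs.max(dim=1)                       # confidence and predicted class

    # Simplified label handling: pseudo-label unlabeled data and "correct"
    # labeled data by trusting the classifier's prediction (illustrative only).
    y_real = pred

    # Soft curriculum: weights in [0, 1] down-weight likely-harmful samples
    # instead of removing them, and preserve useful ones.
    w = conf

    d_real = discriminator(x_real, y_real).view(-1)         # class-conditional logits
    return (w * F.relu(1.0 - d_real)).sum() / w.sum()       # weighted hinge loss
```

Compared with a hard curriculum that drops all samples with conf below a fixed cutoff, this weighting keeps the gradient signal from every instance while letting the auxiliary classifier's confidence modulate how much each one influences training.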
