Inception Score, Label Smoothing, Gradient Vanishing and -log(D(x)) Alternative

08/05/2017
by Zhiming Zhou, et al.

In this paper, we study several GAN-related topics mathematically: the Inception score, label smoothing, gradient vanishing, and the -log(D(x)) alternative generator loss. We show that the Inception score is equivalent to the Mode score, with both decomposing into two entropy terms, and that both share the drawback of ignoring the prior distribution of the labels. We therefore propose the AM score, an alternative that leverages cross-entropy and takes the reference label distribution into account; empirical results indicate that the AM score outperforms the Inception score. We then study label smoothing, gradient vanishing, and the -log(D(x)) alternative from the perspective of class-aware gradients. With this tool, we make precise the problems that arise when label smoothing is applied to fake samples together with the log(1-D(x)) generator loss, which were previously unclear, and, more importantly, we show that these problems do not arise with the -log(D(x)) generator loss.
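
For reference, the "two entropy terms" can be made explicit. With a pretrained classifier p(y|x), the Inception score for a generator G is (notation ours, not taken from the paper's text):

```latex
\log \mathrm{IS}(G)
  = \mathbb{E}_{x \sim G}\!\left[\mathrm{KL}\!\left(p(y \mid x)\,\|\,p(y)\right)\right]
  = H\!\left(p(y)\right) - \mathbb{E}_{x \sim G}\!\left[H\!\left(p(y \mid x)\right)\right],
\qquad p(y) = \mathbb{E}_{x \sim G}\!\left[p(y \mid x)\right].
```

Neither term references the training-set label distribution, which is the drawback the AM score addresses by scoring against a reference distribution. The gradient-vanishing contrast between the two generator losses can likewise be sketched in logit space: writing D(x) = \sigma(l(x)),

```latex
\frac{\partial}{\partial l}\,\log\bigl(1 - \sigma(l)\bigr) = -\,\sigma(l) = -D(x)
\;\longrightarrow\; 0 \text{ as } D(x) \to 0,
\qquad
\frac{\partial}{\partial l}\bigl(-\log \sigma(l)\bigr) = \sigma(l) - 1 = D(x) - 1
\;\longrightarrow\; -1 \text{ as } D(x) \to 0,
```

so on fake samples the discriminator confidently rejects, the log(1-D(x)) loss yields vanishing gradients while the -log(D(x)) alternative does not. A minimal numerical sketch of the two scores follows, assuming an (N, K) array of classifier softmax outputs; the AM-score variant is our paraphrase of the abstract's description (cross-entropy against a reference label distribution), not necessarily the paper's exact formula:

```python
import numpy as np

def log_inception_score(probs, eps=1e-12):
    """log IS = H(p(y)) - E_x[H(p(y|x))]; probs is an (N, K) array of
    softmax outputs of a pretrained classifier on generated samples."""
    p_y = probs.mean(axis=0)                               # marginal p(y)
    h_marginal = -np.sum(p_y * np.log(p_y + eps))          # H(p(y))
    h_conditional = -np.sum(probs * np.log(probs + eps),
                            axis=1).mean()                 # E_x[H(p(y|x))]
    return h_marginal - h_conditional                      # higher is better

def am_score_sketch(probs, ref, eps=1e-12):
    """Hypothetical form of the AM score (our reading of the abstract,
    lower is better): cross-entropy between a reference label
    distribution `ref` (e.g. training-set label frequencies) and the
    generated marginal, plus the mean per-sample prediction entropy."""
    p_y = probs.mean(axis=0)
    cross_entropy = -np.sum(ref * np.log(p_y + eps))       # H(ref, p(y))
    h_conditional = -np.sum(probs * np.log(probs + eps), axis=1).mean()
    return cross_entropy + h_conditional
```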

Related research

03/05/2020 · Does label smoothing mitigate label noise?
Label smoothing is commonly used in training deep learning models, where...

05/18/2021 · Label Inference Attacks from Log-loss Scores
Log-loss (also known as cross-entropy loss) metric is ubiquitously used ...

06/27/2019 · Adversarial Robustness via Adversarial Label-Smoothing
We study Label-Smoothing as a means for improving adversarial robustness...

08/31/2021 · Chi-square Loss for Softmax: an Echo of Neural Network Structure
Softmax working with cross-entropy is widely used in classification, whi...

05/02/2020 · Generalized Entropy Regularization or: There's Nothing Special about Label Smoothing
Prior work has explored directly regularizing the output distributions o...

10/16/2018 · Discriminator Rejection Sampling
We propose a rejection sampling scheme using the discriminator of a GAN ...
