Wide-Residual-Network
Implementation of a Wide Residual Network in TensorFlow for image classification. Trained and tested on the CIFAR-10 dataset.
In this paper, we study several GAN-related topics mathematically, including the Inception score, label smoothing, gradient vanishing, and the -log(D(x)) alternative generator loss. We show that the Inception score is equivalent to the Mode score: both consist of two entropy terms and therefore share the drawback of ignoring the prior distribution of the labels. We thus propose the AM score as an alternative that leverages cross-entropy and takes the reference label distribution into account; empirical results indicate that the AM score outperforms the Inception score. We then study label smoothing, gradient vanishing, and the -log(D(x)) alternative from the perspective of the class-aware gradient. This analysis reveals the precise problems, previously unclear, that arise when label smoothing is applied to fake samples together with the log(1-D(x)) generator loss, and, more importantly, shows that these problems do not occur with the -log(D(x)) generator loss.
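The abstract describes the Inception score as two entropy terms and the AM score as a cross-entropy-based alternative that accounts for a reference label distribution. As a rough illustration of that structure, here is a minimal NumPy sketch; the exact AM-score formulation in the paper may differ, and the KL-against-reference form below is an assumption for illustration only.

```python
import numpy as np

EPS = 1e-12  # guard against log(0)

def inception_score(p_yx):
    """p_yx: (N, C) array of classifier posteriors p(y|x) for N samples.

    IS = exp( E_x[ KL(p(y|x) || p(y)) ] )
       = exp( H(p(y)) - E_x[ H(p(y|x)) ] ),
    i.e. two entropy terms; the label prior never enters.
    """
    p_y = p_yx.mean(axis=0)  # marginal label distribution p(y)
    h_marginal = -(p_y * np.log(np.clip(p_y, EPS, 1))).sum()
    h_cond = -(p_yx * np.log(np.clip(p_yx, EPS, 1))).sum(axis=1).mean()
    return np.exp(h_marginal - h_cond)

def am_score(p_yx, p_ref):
    """Hypothetical sketch of an AM-style score (lower is better):
    replace H(p(y)) with a divergence against a reference label
    distribution p_ref, so the prior is no longer ignored, and keep
    the conditional-entropy term for per-sample confidence.
    """
    p_y = p_yx.mean(axis=0)
    kl_ref = (p_ref * np.log(np.clip(p_ref, EPS, 1) /
                             np.clip(p_y, EPS, 1))).sum()
    h_cond = -(p_yx * np.log(np.clip(p_yx, EPS, 1))).sum(axis=1).mean()
    return kl_ref + h_cond
```

For perfectly confident, class-balanced predictions over C classes, `inception_score` reaches its maximum C, while `am_score` with a matching uniform reference reaches its minimum 0; a skewed reference distribution penalizes the same predictions, which the Inception score cannot express.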