Adversarial Dropout for Supervised and Semi-supervised Learning

07/12/2017
by Sungrae Park, et al.

Training with adversarial examples, which are generated by adding a small but worst-case perturbation to input examples, has recently been shown to improve the generalization performance of neural networks. In contrast to these individually perturbed inputs, this paper introduces adversarial dropout: a minimal set of dropped units that maximizes the divergence between the outputs of the network with the dropout applied and the training supervision. The identified adversarial dropout is used to reconfigure the neural network during training, and we demonstrate that training on the reconfigured sub-network improves generalization performance on supervised and semi-supervised learning tasks on MNIST and CIFAR-10. We analyze the trained models to explain the performance improvement and find that adversarial dropout increases the sparsity of neural networks more than standard dropout does.
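To make the core idea concrete, below is a minimal toy sketch of finding an adversarial dropout mask. It uses a greedy search over hidden units (the paper uses a gradient-based 0/1 approximation; the greedy search, the tiny fixed-weight network, and the `budget` parameter here are illustrative assumptions, not the authors' implementation): starting from the all-ones mask, it drops the units whose removal most increases the loss against the supervision, up to a change budget.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny one-hidden-layer network with fixed random weights (illustrative only;
# not the architecture used in the paper).
W1 = rng.normal(size=(4, 8))
W2 = rng.normal(size=(8, 3))

def forward(x, mask):
    """Forward pass; `mask` is a 0/1 vector dropping hidden units."""
    h = np.maximum(x @ W1, 0.0) * mask   # ReLU, then apply the dropout mask
    logits = h @ W2
    e = np.exp(logits - logits.max())
    return e / e.sum()                   # softmax probabilities

def cross_entropy(p, y):
    return -np.log(p[y] + 1e-12)

def adversarial_dropout(x, y, budget=2):
    """Greedily flip at most `budget` units from the all-ones mask,
    each time dropping the unit whose removal most increases the loss.
    A toy stand-in for the paper's gradient-based approximation."""
    mask = np.ones(W1.shape[1])
    for _ in range(budget):
        base = cross_entropy(forward(x, mask), y)
        best_gain, best_j = 0.0, None
        for j in np.flatnonzero(mask):
            trial = mask.copy()
            trial[j] = 0.0
            gain = cross_entropy(forward(x, trial), y) - base
            if gain > best_gain:
                best_gain, best_j = gain, j
        if best_j is None:   # no single drop increases the loss further
            break
        mask[best_j] = 0.0
    return mask

x = rng.normal(size=4)
adv_mask = adversarial_dropout(x, y=1, budget=2)
adv_loss = cross_entropy(forward(x, adv_mask), 1)
base_loss = cross_entropy(forward(x, np.ones(8)), 1)
```

During training, the network would then be updated on the loss computed under `adv_mask`, so the sub-network that is worst for the current supervision is the one being corrected.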


