Semi-supervised Sequential Generative Models
We introduce a novel objective for training deep generative time-series models with discrete latent variables for which supervision is only sparsely available. This instance of semi-supervised learning is challenging for existing methods, because the exponential number of possible discrete latent configurations results in high variance gradient estimators. We first overcome this problem by extending the standard semi-supervised generative modeling objective with reweighted wake-sleep. However, we find that this approach still suffers when the frequency of available labels varies between training sequences. Finally, we introduce a unified objective inspired by teacher-forcing and show that this approach is robust to variable length supervision. We call the resulting method caffeinated wake-sleep (CWS) to emphasize its additional dependence on real data. We demonstrate its effectiveness with experiments on MNIST, handwriting, and fruit fly trajectory data.
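The abstract's core mechanism — taming high-variance gradients over discrete latents by reweighting samples from a recognition model, as in reweighted wake-sleep — can be illustrated with a toy sketch. The model below (a two-state latent, a two-state observation, and hand-picked probability tables) is entirely hypothetical and is not the paper's CWS objective; it only shows the self-normalized importance weights used in a wake-sleep-style wake phase.

```python
import random

random.seed(0)

# Toy discrete latent-variable model: latent z in {0, 1}, observation x in {0, 1}.
# Generative model p(z) p(x|z) and a recognition model q(z|x).
# All parameter values here are illustrative assumptions, not from the paper.
p_z = [0.5, 0.5]                         # prior p(z)
p_x_given_z = [[0.9, 0.1], [0.2, 0.8]]   # p(x|z), rows indexed by z
q_z_given_x = [[0.7, 0.3], [0.4, 0.6]]   # recognition model q(z|x), rows indexed by x

def normalized_importance_weights(x, k=8):
    """Sample k latents from q(z|x) and return self-normalized weights
    w_i proportional to p(x, z_i) / q(z_i | x), the quantity a
    reweighted-wake-sleep wake phase uses to weight per-sample gradients
    instead of relying on a single high-variance score-function estimate."""
    zs = [0 if random.random() < q_z_given_x[x][0] else 1 for _ in range(k)]
    raw = [p_z[z] * p_x_given_z[z][x] / q_z_given_x[x][z] for z in zs]
    total = sum(raw)
    return zs, [w / total for w in raw]

zs, ws = normalized_importance_weights(x=1, k=8)
```

With only two latent states the weights could be computed exactly, but the sampled form mirrors what happens when, as the abstract notes, the number of discrete latent configurations grows exponentially with sequence length and enumeration becomes infeasible.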