Neural Rendering Model: Joint Generation and Prediction for Semi-Supervised Learning

11/01/2018
by   Nhat Ho, et al.

Unsupervised and semi-supervised learning are important problems that are especially challenging with complex data like natural images. Progress on these problems would accelerate if we had access to appropriate generative models under which to pose the associated inference tasks. Inspired by the success of Convolutional Neural Networks (CNNs) for supervised prediction on images, we design the Neural Rendering Model (NRM), a new probabilistic generative model whose inference calculations correspond to those in a given CNN architecture. The NRM uses the given CNN to design the prior distribution in the probabilistic model. Furthermore, the NRM generates images from coarse to fine scales. It introduces a small set of latent variables at each level and enforces dependencies among all the latent variables via a conjugate prior distribution. This conjugate prior yields a new regularizer for training CNNs based on the paths rendered in the generative model: the Rendering Path Normalization (RPN). We demonstrate that this regularizer improves generalization, both in theory and in practice. In addition, likelihood estimation in the NRM yields training losses for CNNs, and, inspired by this, we design a new loss termed the Max-Min cross-entropy, which outperforms the traditional cross-entropy loss for object classification. The Max-Min cross-entropy suggests a new deep network architecture, namely the Max-Min network, which can learn from less labeled data while maintaining good prediction performance. Our experiments demonstrate that the NRM with the RPN and the Max-Min architecture exceeds or matches the state of the art on benchmarks including SVHN, CIFAR10, and CIFAR100 for semi-supervised and supervised learning tasks.
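To make the coarse-to-fine rendering idea above concrete, the following is a minimal, generic PyTorch sketch of a class-conditional generator that injects a small set of discrete latent variables at each level and renders at progressively finer scales. Every name and size here (CoarseToFineSketch, latents_per_level, the additive latent injection, the transposed-convolution upsampling) is an illustrative assumption, not the NRM's actual rendering equations or the architecture used in the paper.

import torch
import torch.nn as nn

class CoarseToFineSketch(nn.Module):
    # Toy coarse-to-fine generator: one small set of discrete latents per level,
    # each level doubling the spatial resolution of the rendered feature map.
    def __init__(self, num_classes=10, latents_per_level=4, levels=3, base=8):
        super().__init__()
        self.levels, self.base = levels, base
        self.start = nn.Linear(num_classes, 64 * base * base)  # coarsest representation
        self.latents = nn.ModuleList(nn.Embedding(latents_per_level, 64) for _ in range(levels))
        self.ups = nn.ModuleList(
            nn.ConvTranspose2d(64, 64, kernel_size=4, stride=2, padding=1) for _ in range(levels)
        )
        self.to_img = nn.Conv2d(64, 3, kernel_size=3, padding=1)

    def forward(self, y_onehot, z_indices):
        # y_onehot: (B, num_classes) class label; z_indices: (B, levels) integer latents.
        h = self.start(y_onehot).view(-1, 64, self.base, self.base)
        for level in range(self.levels):
            z = self.latents[level](z_indices[:, level])  # latent chosen at this level
            h = torch.relu(self.ups[level](h + z[:, :, None, None]))  # render finer scale
        return torch.sigmoid(self.to_img(h))

# Usage: render two images of class 3 with randomly sampled per-level latents.
model = CoarseToFineSketch()
y = torch.zeros(2, 10); y[:, 3] = 1.0
z = torch.randint(0, 4, (2, 3))
imgs = model(y, z)  # shape (2, 3, 64, 64)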
