Neural Simpletrons - Minimalistic Directed Generative Networks for Learning with Few Labels

06/28/2015
by Dennis Forster, et al.

Classifiers for the semi-supervised setting often combine strong supervised models with additional learning objectives to make use of unlabeled data. This results in powerful but very complex models that are hard to train and that demand additional labels for optimal parameter tuning, labels which are often unavailable when labeled data is very sparse. Here we study a minimalistic multi-layer generative neural network for semi-supervised learning, in a form and setting as similar to standard discriminative networks as possible. Based on normalized Poisson mixtures, we derive compact and local learning and neural activation rules. Learning and inference in the network can be scaled using standard deep learning tools for parallelized GPU implementation. With the single objective of likelihood optimization, both labeled and unlabeled data are naturally incorporated into learning. Empirical evaluations on standard benchmarks show that for datasets with few labels the derived minimalistic network improves on all classical deep learning approaches and is competitive with their recent variants, without the need for additional labels for parameter tuning. Furthermore, we find that the studied network is the best-performing monolithic (`non-hybrid') system for few labels, and that it can be applied in the limit of very few labels, where no other system has been reported to operate so far.

Related Research

11/30/2020
MUSCLE: Strengthening Semi-Supervised Learning Via Concurrent Unsupervised Learning Using Mutual Information Maximization
Deep neural networks are powerful, massively parameterized machine learn...

02/07/2017
Truncated Variational EM for Semi-Supervised Neural Simpletrons
Inference and learning for probabilistic generative networks is often ve...

12/04/2020
Matching Distributions via Optimal Transport for Semi-Supervised Learning
Semi-Supervised Learning (SSL) approaches have been an influential frame...

11/22/2016
Max-Margin Deep Generative Models for (Semi-)Supervised Learning
Deep generative models (DGMs) are effective on learning multilayered rep...

06/29/2021
Semi-supervised learning with Bayesian Confidence Propagation Neural Network
Learning internal representations from data using no or few labels is us...

10/20/2021
Model Composition: Can Multiple Neural Networks Be Combined into a Single Network Using Only Unlabeled Data?
The diversity of deep learning applications, datasets, and neural networ...

08/14/2017
A learning framework for winner-take-all networks with stochastic synapses
Many recent generative models make use of neural networks to transform t...
