Learning Non-deterministic Representations with Energy-based Ensembles

12/23/2014
by Maruan Al-Shedivat, et al.

The goal of a generative model is to capture the distribution underlying the data, typically through latent variables. After training, these variables are often used as a new representation, more effective than the original features in a variety of learning tasks. However, the representations constructed by contemporary generative models are usually point-wise deterministic mappings from the original feature space. Thus, even with representations robust to class-specific transformations, statistically driven models trained on them would not be able to generalize when labeled data is scarce. Inspired by the stochasticity of synaptic connections in the brain, we introduce Energy-based Stochastic Ensembles. These ensembles can learn non-deterministic representations, i.e., mappings from the feature space to a family of distributions in the latent space. These mappings are encoded in a distribution over a (possibly infinite) collection of models. By conditionally sampling models from the ensemble, we obtain multiple representations for every input example and effectively augment the data. We propose an algorithm similar to contrastive divergence for training restricted Boltzmann stochastic ensembles. Finally, we demonstrate the concept of stochastic representations on a synthetic dataset and test them in a one-shot learning scenario on MNIST.
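To make the core idea concrete, below is a minimal NumPy sketch of non-deterministic encoding: instead of a single fixed encoder, we keep a distribution over model parameters (here, an illustrative diagonal Gaussian over RBM weights) and draw a fresh model for every encoding pass, so one input yields many latent representations. All names, sizes, and the Gaussian parameterization are assumptions for illustration, not the paper's exact construction or training procedure.

```python
# Minimal sketch of a "stochastic ensemble" encoder: a distribution over
# RBM weight matrices rather than one fixed weight matrix. Hypothetical
# parameterization; not the authors' implementation.
import numpy as np

rng = np.random.default_rng(0)

n_visible, n_hidden = 784, 64  # e.g., MNIST pixels -> 64 latent units

# The ensemble: mean and (diagonal) std of a Gaussian over RBM weights.
W_mean = rng.normal(scale=0.01, size=(n_visible, n_hidden))
W_std = np.full((n_visible, n_hidden), 0.05)
b_hidden = np.zeros(n_hidden)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def stochastic_representations(v, n_samples=10):
    """Encode one input v with n_samples models drawn from the ensemble,
    returning n_samples different latent representations of the same input."""
    reps = []
    for _ in range(n_samples):
        W = rng.normal(W_mean, W_std)           # sample one concrete RBM from the ensemble
        reps.append(sigmoid(v @ W + b_hidden))  # hidden-unit probabilities under that model
    return np.stack(reps)

v = rng.integers(0, 2, size=n_visible).astype(float)  # a toy binary input
reps = stochastic_representations(v)
print(reps.shape)  # (10, 64): ten distinct latent codes for a single example
```

In a downstream low-data setting, each sampled code can be treated as an additional training example for the same label, which is the data-augmentation effect the abstract describes.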

Related research

12/26/2019
OCCER- One-Class Classification by Ensembles of Regression models
One-class classification (OCC) deals with the classification problem in ...

10/02/2020
Encoded Prior Sliced Wasserstein AutoEncoder for learning latent manifold representations
While variational autoencoders have been successful generative models for ...

05/21/2023
Exploring How Generative Adversarial Networks Learn Phonological Representations
This paper explores how Generative Adversarial Networks (GANs) learn representations ...

12/24/2020
RBM-Flow and D-Flow: Invertible Flows with Discrete Energy Base Spaces
Efficient sampling of complex data distributions can be achieved using t...

06/23/2022
Disentangling representations in Restricted Boltzmann Machines without adversaries
A goal of unsupervised machine learning is to disentangle representations ...

11/14/2015
Stochastic Synapses Enable Efficient Brain-Inspired Learning Machines
Recent studies have shown that synaptic unreliability is a robust and su...

10/07/2020
Conditional Generative Modeling via Learning the Latent Space
Although deep learning has achieved appealing results on several machine ...
