Layer-wise learning of deep generative models

12/07/2012
by   Ludovic Arnold, et al.

When using deep, multi-layered architectures to build generative models of data, it is difficult to train all layers at once. We propose a layer-wise training procedure admitting a performance guarantee compared to the global optimum. It is based on an optimistic proxy of future performance, the best latent marginal. We interpret auto-encoders in this setting as generative models by showing that they train a lower bound of this criterion. We test the new learning procedure against a state-of-the-art method (stacked RBMs) and find that it improves performance. Both theory and experiments highlight the importance, when training deep architectures, of using an inference model (from data to hidden variables) richer than the generative model (from hidden variables to data).
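As a concrete illustration of the greedy layer-wise scheme the abstract describes, here is a minimal numpy sketch: each auto-encoder layer is trained on the codes produced by the layer below, then frozen before the next layer is fitted. Plain sigmoid auto-encoders and squared reconstruction error stand in for the paper's best-latent-marginal criterion, and the names (AutoEncoderLayer, train_layerwise) are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class AutoEncoderLayer:
    """A single sigmoid auto-encoder with untied encoder/decoder weights
    (untied so the inference direction can be parameterized independently)."""

    def __init__(self, n_in, n_hid):
        self.W_enc = rng.normal(0.0, 0.01, (n_in, n_hid))
        self.W_dec = rng.normal(0.0, 0.01, (n_hid, n_in))
        self.b_hid = np.zeros(n_hid)
        self.b_vis = np.zeros(n_in)

    def encode(self, x):
        return sigmoid(x @ self.W_enc + self.b_hid)

    def decode(self, h):
        return sigmoid(h @ self.W_dec + self.b_vis)

    def train_step(self, x, lr=0.5):
        """One gradient step on mean squared reconstruction error."""
        h = self.encode(x)
        x_hat = self.decode(h)
        n = x.shape[0]
        # Backpropagate through the two sigmoid layers.
        d_out = (x_hat - x) * x_hat * (1.0 - x_hat)      # at decoder pre-activation
        d_hid = (d_out @ self.W_dec.T) * h * (1.0 - h)   # at encoder pre-activation
        self.W_dec -= lr * (h.T @ d_out) / n
        self.b_vis -= lr * d_out.sum(axis=0) / n
        self.W_enc -= lr * (x.T @ d_hid) / n
        self.b_hid -= lr * d_hid.sum(axis=0) / n
        return float(((x_hat - x) ** 2).mean())

def train_layerwise(data, layer_sizes, epochs=100):
    """Greedy layer-wise training: fit one layer at a time, freeze it,
    and use its codes as the training data for the next layer."""
    layers, x = [], data
    for n_hid in layer_sizes:
        layer = AutoEncoderLayer(x.shape[1], n_hid)
        for _ in range(epochs):
            layer.train_step(x)
        layers.append(layer)
        x = layer.encode(x)  # lower layer's codes feed the next layer
    return layers

# Toy usage on random binary data.
data = (rng.random((256, 32)) > 0.5).astype(float)
stack = train_layerwise(data, layer_sizes=[16, 8])
```

In the paper's setting, the auto-encoder reconstruction objective trains a lower bound of the best-latent-marginal criterion, and the inference direction (encoder) is deliberately made richer than the generative direction (decoder); the untied weights above leave room for that asymmetry.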


Related research

06/13/2019 · Reweighted Expectation Maximization
Training deep generative models with maximum likelihood remains a challe...

05/06/2014 · Is Joint Training Better for Deep Auto-Encoders?
Traditionally, when generative models of data are developed via deep arc...

01/03/2021 · StarNet: Gradient-free Training of Deep Generative Models using Determined System of Linear Equations
In this paper we present an approach for training deep generative models...

04/01/2023 · Hidden Layer Interaction: A Co-Creative Design Fiction for Generative Models
This paper presents a speculation on a fictive co-creation scenario that...

01/06/2018 · Design Exploration of Hybrid CMOS-OxRAM Deep Generative Architectures
Deep Learning and its applications have gained tremendous interest recen...

10/20/2022 · Graphically Structured Diffusion Models
We introduce a framework for automatically defining and learning deep ge...

03/20/2016 · Joint Stochastic Approximation learning of Helmholtz Machines
Though with progress, model learning and performing posterior inference ...
