
Recurrent Exponential-Family Harmoniums without Backprop-Through-Time

by Joseph G. Makin, et al.

Exponential-family harmoniums (EFHs), which extend restricted Boltzmann machines (RBMs) from Bernoulli random variables to other exponential families (Welling et al., 2005), are generative models that can be trained with unsupervised-learning techniques, like contrastive divergence (Hinton et al., 2006; Hinton, 2002), as density estimators for static data. Methods for extending RBMs--and likewise EFHs--to data with temporal dependencies have been proposed previously (Sutskever and Hinton, 2007; Sutskever et al., 2009), the learning procedure being validated by qualitative assessment of the generative model. Here we propose and justify, from a very different perspective, an alternative training procedure, proving sufficient conditions for optimal inference under that procedure. The resulting model can be trained with only forward passes through the data: unlike previous approaches, backprop-through-time is not required. The proof exploits a recent result about information retention in density estimators (Makin and Sabes, 2015), and applies it to a "recurrent EFH" (rEFH) by induction. Finally, we demonstrate optimality by simulation, testing the rEFH: (1) as a filter on training data generated with a linear dynamical system, the position of which is noisily reported by a population of "neurons" with Poisson-distributed spike counts; and (2) with the qualitative experiments proposed by Sutskever et al. (2009).
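For readers unfamiliar with the base learning rule the abstract refers to, here is a minimal sketch of one contrastive-divergence (CD-1) update for the Bernoulli RBM case. This is the generic algorithm of Hinton (2002), not the paper's rEFH procedure; all variable names, dimensions, and the learning rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, W, b, c, lr=0.01):
    """One CD-1 update for a Bernoulli RBM.

    v0 : (batch, n_visible) binary data
    W  : (n_visible, n_hidden) weights
    b  : (n_visible,) visible biases
    c  : (n_hidden,) hidden biases
    """
    # Positive phase: hidden probabilities and samples given the data.
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: one Gibbs step back down to the visibles and up again.
    pv1 = sigmoid(h0 @ W.T + b)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)
    # Approximate log-likelihood gradient: data statistics minus
    # one-step reconstruction statistics.
    n = v0.shape[0]
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / n
    b += lr * (v0 - v1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)
    return W, b, c
```

Extending this to other exponential families (e.g., the Poisson visible units used in the paper's filtering experiment) changes the conditional sampling distributions but leaves the two-phase structure of the update intact.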

