
Learning Stochastic Recurrent Networks

11/27/2014
by   Justin Bayer, et al.

Leveraging advances in variational inference, we propose to enhance recurrent neural networks with latent variables, resulting in Stochastic Recurrent Networks (STORNs). The model i) can be trained with stochastic gradient methods, ii) allows structured and multi-modal conditionals at each time step, iii) features a reliable estimator of the marginal likelihood and iv) is a generalisation of deterministic recurrent neural networks. We evaluate the method on four polyphonic musical data sets and motion capture data.
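To make the core idea concrete, here is a minimal NumPy sketch of one STORN-style time step: a recognition RNN produces a Gaussian posterior over a latent variable z_t, a sample is drawn via the reparameterisation trick, and a generative RNN conditions on both the observation and the latent sample. This is an illustrative toy, not the paper's implementation; all weight names, dimensions, and the random initialisation are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: observation, latent, hidden
n_x, n_z, n_h = 3, 2, 5

# Randomly initialised weights (a trained model would learn these via SGVB)
W_enc = rng.normal(size=(n_h, n_x + n_h)) * 0.1        # recognition RNN
W_mu  = rng.normal(size=(n_z, n_h)) * 0.1              # posterior mean head
W_sig = rng.normal(size=(n_z, n_h)) * 0.1              # posterior log-std head
W_dec = rng.normal(size=(n_h, n_x + n_z + n_h)) * 0.1  # generative RNN
W_out = rng.normal(size=(n_x, n_h)) * 0.1              # output projection

def storn_step(x_t, h_enc, h_dec):
    """One time step: infer q(z_t | x), sample z_t, predict the next x."""
    # Recognition network: update encoder state from the observation
    h_enc = np.tanh(W_enc @ np.concatenate([x_t, h_enc]))
    mu, log_sig = W_mu @ h_enc, W_sig @ h_enc
    # Reparameterisation trick: z = mu + sigma * eps, with eps ~ N(0, I),
    # keeps the sampling step differentiable for stochastic gradient training
    eps = rng.normal(size=n_z)
    z_t = mu + np.exp(log_sig) * eps
    # Generative network: condition on both the input and the latent sample,
    # which is what allows multi-modal conditionals at each time step
    h_dec = np.tanh(W_dec @ np.concatenate([x_t, z_t, h_dec]))
    x_pred = W_out @ h_dec
    # Analytic per-step KL(q(z_t | x) || N(0, I)) for a diagonal Gaussian
    kl = 0.5 * np.sum(np.exp(2 * log_sig) + mu**2 - 1 - 2 * log_sig)
    return x_pred, h_enc, h_dec, kl

# Run over a toy sequence and accumulate the KL part of the ELBO
xs = rng.normal(size=(4, n_x))
h_enc, h_dec, total_kl = np.zeros(n_h), np.zeros(n_h), 0.0
for x_t in xs:
    x_pred, h_enc, h_dec, kl = storn_step(x_t, h_enc, h_dec)
    total_kl += kl
```

Setting the latent dimension to zero (or the posterior variance to zero at its prior mean) recovers an ordinary deterministic RNN, which is the sense in which STORN generalises deterministic recurrent networks.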


Related Research

05/24/2016 — Sequential Neural Models with Stochastic Layers: How can we efficiently propagate uncertainty in a latent state represent…

10/28/2019 — Semi-Implicit Stochastic Recurrent Neural Networks: Stochastic recurrent neural networks with latent random variables of com…

11/30/2017 — A Neural Stochastic Volatility Model: In this paper, we show that the recent integration of statistical models…

11/15/2017 — Z-Forcing: Training Stochastic Recurrent Networks: Many efforts have been devoted to training generative latent variable mo…

07/05/2019 — A Unified Framework of Online Learning Algorithms for Training Recurrent Neural Networks: We present a framework for compactly summarizing many recent results in…

05/18/2018 — Approximate Bayesian inference in spatial environments: We propose to learn a stochastic recurrent model to solve the problem of…

10/20/2020 — Variational Dynamic Mixtures: Deep probabilistic time series forecasting models have become an integra…

Code Repositories

PyGotham2015 — Slides for PyGotham2015