Learning Stochastic Recurrent Networks

11/27/2014
by Justin Bayer, et al.

Leveraging advances in variational inference, we propose to enhance recurrent neural networks with latent variables, resulting in Stochastic Recurrent Networks (STORNs). The model i) can be trained with stochastic gradient methods, ii) allows structured and multi-modal conditionals at each time step, iii) features a reliable estimator of the marginal likelihood, and iv) is a generalisation of deterministic recurrent neural networks. We evaluate the method on four polyphonic musical data sets and motion capture data.
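The abstract compresses the idea: a recurrent network is augmented with a latent random variable z_t at every time step, and the model is trained by maximising a variational lower bound (ELBO) via the reparameterisation trick, so ordinary stochastic gradient methods apply. Below is a minimal sketch of such a STORN-style model in PyTorch. The layer sizes, the GRU cells, the Bernoulli output (a natural fit for binary piano-roll encodings of polyphonic music), and the single-sample ELBO are illustrative assumptions, not the paper's exact architecture.

import torch
import torch.nn as nn

class STORN(nn.Module):
    def __init__(self, x_dim, z_dim, h_dim):
        super().__init__()
        # Recognition RNN: reads the observations and emits the parameters
        # of the approximate posterior q(z_t | x).
        self.enc_rnn = nn.GRU(x_dim, h_dim, batch_first=True)
        self.enc_mu = nn.Linear(h_dim, z_dim)
        self.enc_logvar = nn.Linear(h_dim, z_dim)
        # Generative RNN: consumes the previous observation together with
        # the sampled latent z_t and predicts p(x_t | x_{<t}, z_{<=t}).
        self.dec_rnn = nn.GRU(x_dim + z_dim, h_dim, batch_first=True)
        self.dec_out = nn.Linear(h_dim, x_dim)  # Bernoulli logits

    def forward(self, x):
        # x: (batch, T, x_dim); the model predicts each x_t from its past.
        h_enc, _ = self.enc_rnn(x)
        mu, logvar = self.enc_mu(h_enc), self.enc_logvar(h_enc)
        # Reparameterisation trick: z = mu + sigma * eps, so the ELBO can
        # be optimised with ordinary stochastic gradients.
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        # Shift inputs by one step so x_t is predicted from x_{t-1} and z_t.
        x_prev = torch.cat([torch.zeros_like(x[:, :1]), x[:, :-1]], dim=1)
        h_dec, _ = self.dec_rnn(torch.cat([x_prev, z], dim=-1))
        logits = self.dec_out(h_dec)
        # Single-sample ELBO: reconstruction term minus the KL divergence
        # between q(z_t | x) and a standard-normal prior, summed over time.
        rec = -nn.functional.binary_cross_entropy_with_logits(
            logits, x, reduction="sum")
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
        return -(rec - kl) / x.size(0)  # negative ELBO per sequence

A hypothetical call, using 88 piano-roll pitches as in common polyphonic music benchmarks:

model = STORN(x_dim=88, z_dim=16, h_dim=128)
loss = model(torch.bernoulli(torch.rand(8, 50, 88)))  # (batch, time, pitches)
loss.backward()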

Related research

05/24/2016 · Sequential Neural Models with Stochastic Layers
How can we efficiently propagate uncertainty in a latent state represent...

10/28/2019 · Semi-Implicit Stochastic Recurrent Neural Networks
Stochastic recurrent neural networks with latent random variables of com...

11/30/2017 · A Neural Stochastic Volatility Model
In this paper, we show that the recent integration of statistical models...

11/15/2017 · Z-Forcing: Training Stochastic Recurrent Networks
Many efforts have been devoted to training generative latent variable mo...

07/05/2019 · A Unified Framework of Online Learning Algorithms for Training Recurrent Neural Networks
We present a framework for compactly summarizing many recent results in ...

03/19/2023 · Dynamical Hyperspectral Unmixing with Variational Recurrent Neural Networks
Multitemporal hyperspectral unmixing (MTHU) is a fundamental tool in the...

10/20/2020 · Variational Dynamic Mixtures
Deep probabilistic time series forecasting models have become an integra...
