Re-examination of the Role of Latent Variables in Sequence Modeling

02/04/2019
by Zihang Dai, et al.

With latent variables, stochastic recurrent models have achieved state-of-the-art performance in modeling sound-wave sequences. However, the opposite has been observed in other domains, where standard recurrent networks often outperform stochastic models. To better understand this discrepancy, we re-examine the roles of latent variables in stochastic recurrent models for speech density estimation. Our analysis reveals that, under the restriction of a fully factorized output distribution in previous evaluations, the stochastic models were implicitly leveraging intra-step correlation while the standard recurrent baselines were prohibited from doing so, resulting in an unfair comparison. To correct this unfairness, we remove the restriction in our re-examination, so that all models can explicitly leverage intra-step correlation with an auto-regressive structure. Over a diverse set of sequential data, including human speech, MIDI music, handwriting trajectories, and frame-permuted speech, our results show that stochastic recurrent models fail to exhibit any practical advantage despite their claimed theoretical superiority. In contrast, standard recurrent models equipped with an auto-regressive output distribution consistently perform better, significantly advancing the state-of-the-art results on three speech datasets.
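To make the distinction concrete, the sketch below (not from the paper; the class names, GRU backbone, and Gaussian output heads are illustrative assumptions) contrasts a standard recurrent model with a fully factorized per-step output against one with an auto-regressive per-step output that conditions each frame dimension on the dimensions already generated within the same step.

```python
# Minimal PyTorch sketch (illustrative assumptions, not the authors' code)
# of the two per-step output parameterizations contrasted in the abstract.
import torch
import torch.nn as nn


class FactorizedOutputRNN(nn.Module):
    """Standard RNN with a fully factorized per-step Gaussian output:
    the D dimensions of frame t are conditionally independent given h_t."""

    def __init__(self, frame_dim, hidden_dim):
        super().__init__()
        self.rnn = nn.GRU(frame_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 2 * frame_dim)  # per-dim mean, log-variance

    def log_prob(self, x):
        # x: (batch, time, frame_dim); frame t is predicted from frames < t.
        h, _ = self.rnn(x[:, :-1])
        mean, log_var = self.head(h).chunk(2, dim=-1)
        dist = torch.distributions.Normal(mean, (0.5 * log_var).exp())
        # Summing log-probs over dimensions assumes intra-step independence.
        return dist.log_prob(x[:, 1:]).sum(dim=(-2, -1))


class AutoregressiveOutputRNN(nn.Module):
    """Standard RNN whose per-step output captures intra-step correlation:
    dimension d of frame t is conditioned on dimensions < d of the same frame."""

    def __init__(self, frame_dim, hidden_dim):
        super().__init__()
        self.frame_dim = frame_dim
        self.rnn = nn.GRU(frame_dim, hidden_dim, batch_first=True)
        # One small conditional head per dimension: p(x_{t,d} | h_t, x_{t,<d}).
        self.heads = nn.ModuleList(
            [nn.Linear(hidden_dim + d, 2) for d in range(frame_dim)]
        )

    def log_prob(self, x):
        h, _ = self.rnn(x[:, :-1])
        target = x[:, 1:]
        logp = 0.0
        for d in range(self.frame_dim):
            inp = torch.cat([h, target[..., :d]], dim=-1)
            mean, log_var = self.heads[d](inp).unbind(dim=-1)
            dist = torch.distributions.Normal(mean, (0.5 * log_var).exp())
            logp = logp + dist.log_prob(target[..., d]).sum(dim=-1)
        return logp


# Both models return a per-sequence log-likelihood suitable for maximum
# likelihood training on framed sequential data (e.g., speech features).
if __name__ == "__main__":
    x = torch.randn(8, 40, 4)  # batch of 8 sequences, 40 steps, 4-dim frames
    print(FactorizedOutputRNN(4, 32).log_prob(x).shape)      # torch.Size([8])
    print(AutoregressiveOutputRNN(4, 32).log_prob(x).shape)  # torch.Size([8])
```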
