Variational Hyper RNN for Sequence Modeling

02/24/2020
by Ruizhi Deng, et al.

In this work, we propose a novel probabilistic sequence model that excels at capturing high variability in time series data, both across sequences and within an individual sequence. Our method uses temporal latent variables to capture information about the underlying data pattern and dynamically decodes the latent information into modifications of weights of the base decoder and recurrent model. The efficacy of the proposed method is demonstrated on a range of synthetic and real-world sequential data that exhibit large scale variations, regime shifts, and complex dynamics.
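The core idea — a latent variable at each timestep that modulates the weights of the recurrent update — can be illustrated with a minimal NumPy sketch. All names, dimensions, and the specific modulation form (per-unit scaling of the pre-activation) are hypothetical simplifications; the full model also includes an inference network that produces the latents and a decoder for the observations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions for illustration
x_dim, h_dim, z_dim = 3, 8, 4

# Base recurrent weights (shared across timesteps)
W_xh = rng.normal(scale=0.1, size=(h_dim, x_dim))
W_hh = rng.normal(scale=0.1, size=(h_dim, h_dim))
b_h = np.zeros(h_dim)

# Hypernetwork: maps the temporal latent z_t to a per-unit
# scaling of the base recurrent computation
W_hyper = rng.normal(scale=0.1, size=(h_dim, z_dim))
b_hyper = np.ones(h_dim)

def hyper_rnn_step(x_t, h_prev, z_t):
    """One recurrent step whose effective weights depend on z_t."""
    scale = W_hyper @ z_t + b_hyper           # latent-dependent modulation
    pre = W_xh @ x_t + W_hh @ h_prev + b_h    # base RNN pre-activation
    return np.tanh(scale * pre)               # modulated update

# Roll the cell over a short sequence, one latent per timestep.
# In the actual model the z_t would be sampled from an inference network.
T = 5
h = np.zeros(h_dim)
xs = rng.normal(size=(T, x_dim))
zs = rng.normal(size=(T, z_dim))
for x_t, z_t in zip(xs, zs):
    h = hyper_rnn_step(x_t, h, z_t)

print(h.shape)  # (8,)
```

Because the latents reshape the transition function itself rather than merely entering as extra inputs, the same base cell can realize different dynamics at different points in a sequence — which is what lets this family of models track regime shifts and scale changes.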


Related research

- 03/06/2023 · Time series anomaly detection with sequence reconstruction based state-space model
  Recent advances in digitization have led to availability of multivariate ...
- 05/29/2019 · Flexible Mining of Prefix Sequences from Time-Series Traces
  Mining temporal assertions from time-series data using information theor...
- 11/15/2017 · Z-Forcing: Training Stochastic Recurrent Networks
  Many efforts have been devoted to training generative latent variable mo...
- 12/20/2014 · Variational Recurrent Auto-Encoders
  In this paper we propose a model that combines the strengths of RNNs and...
- 09/21/2023 · Variational Connectionist Temporal Classification for Order-Preserving Sequence Modeling
  Connectionist temporal classification (CTC) is commonly adopted for sequ...
- 02/28/2020 · Learning Multivariate Hawkes Processes at Scale
  Multivariate Hawkes Processes (MHPs) are an important class of temporal ...
- 11/04/2018 · Learning to Embed Probabilistic Structures Between Deterministic Chaos and Random Process in a Variational Bayes Predictive-Coding RNN
  This study introduces a stochastic predictive-coding RNN model that can ...
