Unsupervised Neural Hidden Markov Models with a Continuous Latent State Space

06/10/2021 · by Firas Jarboui, et al.

We introduce a new procedure to neuralize unsupervised Hidden Markov Models in the continuous case, which provides greater flexibility for solving problems with underlying latent variables. The approach is evaluated on both synthetic and real data. Beyond producing likely model parameters with performance comparable to off-the-shelf neural architectures (LSTMs, GRUs, etc.), the obtained results are easily interpretable.
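To make the setting concrete, below is a minimal sketch of a state-space model with a continuous latent state, where the transition density p(z_t | z_{t-1}) and emission density p(x_t | z_t) are Gaussians whose parameters come from small neural networks. This is only an illustration of the general model family described in the abstract, not the authors' architecture; the class name NeuralContinuousHMM and all dimensions are assumptions.

import torch
import torch.nn as nn

class NeuralContinuousHMM(nn.Module):
    """Gaussian transition/emission densities parameterized by MLPs (illustrative)."""

    def __init__(self, latent_dim=4, obs_dim=8, hidden=32):
        super().__init__()
        # Transition network: z_{t-1} -> (mean, log-variance) of z_t.
        self.trans = nn.Sequential(
            nn.Linear(latent_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, 2 * latent_dim),
        )
        # Emission network: z_t -> (mean, log-variance) of x_t.
        self.emit = nn.Sequential(
            nn.Linear(latent_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, 2 * obs_dim),
        )
        self.latent_dim = latent_dim

    def sample(self, T, batch=1):
        """Ancestral sampling of a length-T observation sequence."""
        z = torch.zeros(batch, self.latent_dim)  # simple fixed initial state
        xs = []
        for _ in range(T):
            mu_z, logvar_z = self.trans(z).chunk(2, dim=-1)
            z = mu_z + torch.randn_like(mu_z) * (0.5 * logvar_z).exp()
            mu_x, logvar_x = self.emit(z).chunk(2, dim=-1)
            x = mu_x + torch.randn_like(mu_x) * (0.5 * logvar_x).exp()
            xs.append(x)
        return torch.stack(xs, dim=1)  # shape: (batch, T, obs_dim)

model = NeuralContinuousHMM()
print(model.sample(T=20).shape)  # torch.Size([1, 20, 8])

With a continuous state, the exact forward-backward recursions of discrete HMMs are no longer tractable in general, which is why models of this family are typically trained with approximate (e.g. variational) inference rather than exact EM.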

Related research

09/27/2020 · Equivalence of Hidden Markov Models with Continuous Observations
We consider Hidden Markov Models that emit sequences of observations tha...

08/27/2016 · Learning Temporal Dependence from Time-Series Data with Latent Variables
We consider the setting where a collection of time series, modeled as ra...

03/18/2021 · Lossless compression with state space models using bits back coding
We generalize the 'bits back with ANS' method to time-series models with...

12/16/2015 · Learning a Hybrid Architecture for Sequence Regression and Annotation
When learning a hidden Markov model (HMM), sequential observations can...

08/07/2017 · Generative Statistical Models with Self-Emergent Grammar of Chord Sequences
Generative statistical models of chord sequences play crucial roles in m...

02/23/2023 · Generalization of Auto-Regressive Hidden Markov Models to Non-Linear Dynamics and Non-Euclidean Observation Space
Latent variable models are widely used to perform unsupervised segmentat...

11/10/2021 · The modeling of multiple animals that share behavioral features
In this work, we propose a model that can be used to infer the behavior ...
