Learning higher-order sequential structure with cloned HMMs

by Antoine Dedieu et al.

Variable-order sequence modeling is an important problem in artificial and natural intelligence. While overcomplete Hidden Markov Models (HMMs) in theory have the capacity to represent long-term temporal structure, they often fail to learn and instead converge to local minima. We show that by constraining HMMs with a simple sparsity structure inspired by biology, we can make them learn variable-order sequences efficiently. We call this model the cloned HMM (CHMM) because the sparsity structure enforces that many hidden states map deterministically to the same emission state. CHMMs with more than one billion parameters can be trained efficiently on GPUs without being severely affected by the credit diffusion problem of standard HMMs. Unlike n-grams and sequence memoizers, CHMMs can model temporal dependencies at arbitrarily long distances and recognize contexts with "holes" in them. Compared with recurrent neural networks, CHMMs are generative models that natively handle uncertainty. Moreover, CHMMs return a higher-order graph that represents the temporal structure of the data, which is useful for community detection and for building hierarchical models. Our experiments show that CHMMs can beat n-grams, sequence memoizers, and LSTMs on character-level language modeling tasks, making them a viable alternative to these methods in tasks that require variable-order sequence modeling and the handling of uncertainty.
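To make the cloning idea concrete, here is a minimal sketch of the CHMM structure described above: each observed symbol owns a fixed block of hidden "clone" states that deterministically emit that symbol, so the forward pass only ever touches the clone block of the observed symbol. All names, sizes, and the random transition matrix are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

n_symbols = 3        # size of the observation alphabet (illustrative)
n_clones = 4         # clones per symbol (illustrative)
n_hidden = n_symbols * n_clones

rng = np.random.default_rng(0)

# Dense transition matrix over all hidden states; rows sum to 1.
T = rng.random((n_hidden, n_hidden))
T /= T.sum(axis=1, keepdims=True)

def clones(symbol):
    """Indices of the hidden states that deterministically emit `symbol`."""
    return slice(symbol * n_clones, (symbol + 1) * n_clones)

def log_likelihood(seq):
    """Forward pass over a symbol sequence. Because emissions are
    deterministic, each step restricts the transition matrix to the
    block clones(prev) -> clones(curr) instead of the full matrix."""
    alpha = np.full(n_clones, 1.0 / n_clones)  # uniform over first symbol's clones
    ll = 0.0
    for prev, curr in zip(seq[:-1], seq[1:]):
        alpha = alpha @ T[clones(prev), clones(curr)]
        norm = alpha.sum()
        ll += np.log(norm)
        alpha /= norm
    return ll

print(log_likelihood([0, 1, 2, 1, 0]))
```

Note how the per-step cost depends only on `n_clones`, not on `n_hidden`: this block-sparse restriction is what lets models with very large hidden-state counts remain tractable on GPUs.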


Related research:

- Higher Order Recurrent Neural Networks
- Bayesian Nonparametric Higher Order Hidden Markov Models
- Hybrid hidden Markov LSTM for short-term traffic flow prediction
- Scaling Hidden Markov Language Models
- Learning the Markov order of paths in a network
- Riemannian metrics for neural networks II: recurrent networks and learning symbolic data sequences
- Recurrent Ladder Networks
