
Stochastic Collapsed Variational Inference for Sequential Data

by   Pengyu Wang, et al.
University of Oxford

Stochastic variational inference for collapsed models has recently been applied successfully to large-scale topic modelling. In this paper, we propose a stochastic collapsed variational inference algorithm for the sequential data setting. Our algorithm is applicable both to finite hidden Markov models and to hierarchical Dirichlet process hidden Markov models, and to any dataset generated by emission distributions in the exponential family. Our experimental results on two discrete datasets show that our inference is both more efficient and more accurate than its uncollapsed counterpart, stochastic variational inference.
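The paper's algorithm collapses (marginalizes out) the model parameters; the uncollapsed baseline it compares against is standard stochastic variational inference for an HMM. As a point of reference, that baseline can be sketched for a finite HMM with discrete (categorical, hence exponential-family) emissions: sample a minibatch of sequences, run forward-backward under the current expected log-parameters, then take a scaled natural-gradient step on the Dirichlet variational parameters. Everything below (the toy corpus, variable names, step-size schedule) is illustrative and not taken from the paper.

```python
import numpy as np
from scipy.special import digamma, logsumexp

rng = np.random.default_rng(0)

K, V = 2, 3           # hidden states, vocabulary size
N_seqs, T = 100, 20   # corpus size, sequence length

# Toy corpus drawn from a known HMM so the sketch is self-contained.
true_A = np.array([[0.9, 0.1], [0.2, 0.8]])
true_B = np.array([[0.7, 0.2, 0.1], [0.1, 0.3, 0.6]])

def sample_seq():
    z, obs = 0, []
    for _ in range(T):
        obs.append(rng.choice(V, p=true_B[z]))
        z = rng.choice(K, p=true_A[z])
    return np.array(obs)

corpus = [sample_seq() for _ in range(N_seqs)]

def e_log(dirichlet):
    # E[log theta] under a Dirichlet, row-wise, via digamma.
    return digamma(dirichlet) - digamma(dirichlet.sum(-1, keepdims=True))

def forward_backward(seq, logA, logB):
    # Standard forward-backward in log space; returns expected transition
    # counts and expected emission counts for one sequence.
    # (Initial-state distribution taken as uniform; its constant is dropped.)
    n = len(seq)
    la, lb = np.zeros((n, K)), np.zeros((n, K))
    la[0] = logB[:, seq[0]]
    for t in range(1, n):
        la[t] = logB[:, seq[t]] + logsumexp(la[t - 1][:, None] + logA, axis=0)
    for t in range(n - 2, -1, -1):
        lb[t] = logsumexp(logA + logB[:, seq[t + 1]] + lb[t + 1], axis=1)
    lg = la + lb
    gamma = np.exp(lg - logsumexp(lg, axis=1, keepdims=True))
    xi = np.zeros((K, K))
    for t in range(n - 1):
        lxi = la[t][:, None] + logA + logB[:, seq[t + 1]] + lb[t + 1]
        xi += np.exp(lxi - logsumexp(lxi))
    emit = np.zeros((K, V))
    for t in range(n):
        emit[:, seq[t]] += gamma[t]
    return xi, emit

# Variational Dirichlet parameters for transition rows and emission rows.
alpha_A = np.ones((K, K)) + rng.random((K, K))
alpha_B = np.ones((K, V)) + rng.random((K, V))
prior = 1.0

for step in range(200):
    rho = (step + 10.0) ** -0.7          # Robbins-Monro step size
    seq = corpus[rng.integers(N_seqs)]   # minibatch of one sequence
    xi, emit = forward_backward(seq, e_log(alpha_A), e_log(alpha_B))
    # Stochastic natural-gradient step: scale the minibatch sufficient
    # statistics up to the full corpus, then interpolate.
    alpha_A = (1 - rho) * alpha_A + rho * (prior + N_seqs * xi)
    alpha_B = (1 - rho) * alpha_B + rho * (prior + N_seqs * emit)
```

The collapsed algorithm of the paper replaces the explicit Dirichlet factors above with updates derived after integrating the parameters out, which is what the abstract credits for the gains in efficiency and accuracy.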


