Fast MCMC sampling for Markov jump processes and continuous time Bayesian networks

02/14/2012
by Vinayak Rao, et al.

Markov jump processes and continuous time Bayesian networks are important classes of continuous time dynamical systems. In this paper, we tackle the problem of inferring unobserved paths in these models by introducing a fast auxiliary variable Gibbs sampler. Our approach is based on the idea of uniformization, and sets up a Markov chain over paths by sampling a finite set of virtual jump times and then running a standard hidden Markov model forward filtering-backward sampling algorithm over states at the set of extant and virtual jump times. We demonstrate significant computational benefits over a state-of-the-art Gibbs sampler on a number of continuous time Bayesian networks.
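The scheme the abstract describes can be sketched for a toy two-state Markov jump process. Everything concrete below is an illustrative assumption, not the paper's code: the rate matrix `A`, the window `T`, the uniformization rate `Omega = 2 * max_s |A_ss|`, and the uniform initial distribution are made up, and observations are omitted, so the forward filtering-backward sampling (FFBS) step runs under a flat likelihood. One Gibbs sweep (a) samples virtual jump times in each constant segment from a Poisson process with rate `Omega + A_ss`, then (b) resamples states at all extant and virtual jump times with FFBS under the subordinated transition matrix `B = I + A/Omega`, and (c) discards self-transitions to recover an MJP path.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy 2-state rate matrix (rows sum to 0) and time window.
A = np.array([[-1.0, 1.0],
              [2.0, -2.0]])
T = 10.0
Omega = 2.0 * np.max(-np.diag(A))   # uniformization rate, Omega > max_s |A_ss|
B = np.eye(2) + A / Omega           # subordinated DTMC transition matrix

def one_gibbs_sweep(times, states):
    """One sweep of the uniformization-based sampler (prior only, no data)."""
    # (a) Thinning: in each constant segment, add virtual jump times from a
    # Poisson process with state-dependent rate Omega + A_ss (>= 0).
    cand = [0.0]
    seg = np.append(times, T)
    for i in range(len(states)):
        t0, t1 = seg[i], seg[i + 1]
        rate = Omega + A[states[i], states[i]]
        n = rng.poisson(rate * (t1 - t0))
        cand.extend(np.sort(rng.uniform(t0, t1, n)))
        if i + 1 < len(states):           # keep the extant jump time itself
            cand.append(seg[i + 1])
    cand = np.array(cand)

    # (b) FFBS over states at the candidate times. With no observations the
    # likelihood is flat, so this is FFBS under the Markov chain B alone.
    m = len(cand)
    alpha = np.zeros((m, 2))
    alpha[0] = np.array([0.5, 0.5])       # assumed initial distribution
    for i in range(1, m):
        alpha[i] = alpha[i - 1] @ B
        alpha[i] /= alpha[i].sum()
    new_states = np.zeros(m, dtype=int)
    new_states[-1] = rng.choice(2, p=alpha[-1])
    for i in range(m - 2, -1, -1):
        p = alpha[i] * B[:, new_states[i + 1]]
        new_states[i] = rng.choice(2, p=p / p.sum())

    # (c) Drop self-transitions to recover an MJP trajectory.
    keep = np.concatenate([[True], new_states[1:] != new_states[:-1]])
    return cand[keep], new_states[keep]

# Run a few sweeps starting from a constant path in state 0.
times, states = np.array([0.0]), np.array([0], dtype=int)
for _ in range(10):
    times, states = one_gibbs_sweep(times, states)
```

The key point of the construction is that each sweep only does finite work: the Poisson thinning produces finitely many candidate times, and the FFBS pass over them costs the same as a discrete-time HMM sweep, which is what gives the speedup over samplers that integrate the continuous-time dynamics directly.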


Related research

research · 06/13/2012
Gibbs Sampling in Factorized Continuous-Time Markov Processes
A central task in many applications is reasoning about processes that ch...

research · 11/19/2015
Fast Parallel SAME Gibbs Sampling on General Discrete Bayesian Networks
A fundamental task in machine learning and related fields is to perform ...

research · 06/28/2020
Scalable Bayesian Multiple Changepoint Detection via Auxiliary Uniformization
By attaching auxiliary event times to the chronologically ordered observ...

research · 07/01/2020
Continuous-Time Bayesian Networks with Clocks
Structured stochastic processes evolving in continuous time present a wi...

research · 07/01/2020
Augmenting Continuous-Time Bayesian Networks with Clocks
Structured stochastic processes evolving in continuous time present a wi...

research · 03/26/2019
An Exact Auxiliary Variable Gibbs Sampler for a Class of Diffusions
Stochastic differential equations (SDEs) or diffusions are continuous-va...

research · 08/21/2023
Analyzing Complex Systems with Cascades Using Continuous-Time Bayesian Networks
Interacting systems of events may exhibit cascading behavior where event...
