
The Computational Power of Dynamic Bayesian Networks

by Joshua Brulé, et al.
University of Maryland

This paper considers the computational power of constant-size dynamic Bayesian networks. Although discrete dynamic Bayesian networks are no more powerful than hidden Markov models, dynamic Bayesian networks with continuous random variables and discrete children of continuous parents are capable of Turing-complete computation. With modified versions of existing algorithms for belief propagation, such a simulation can be carried out in real time. This result suggests that dynamic Bayesian networks may be more powerful than previously thought. Relationships to causal models and recurrent neural networks are also discussed.
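The abstract's first claim, that a purely discrete dynamic Bayesian network is no more powerful than a hidden Markov model, can be illustrated by collapsing the per-slice discrete state into a single hidden variable and filtering it with the standard HMM forward update. The sketch below is not from the paper; the transition matrix `T`, observation matrix `O`, and all numbers are illustrative assumptions.

```python
# Illustrative sketch: a discrete two-slice DBN with one hidden node per
# time slice behaves exactly like an HMM under forward filtering.
# T and O are assumed example parameters, not taken from the paper.

# P(X_t = j | X_{t-1} = i): hidden-state transition model
T = [[0.9, 0.1],
     [0.2, 0.8]]

# P(Y_t = y | X_t = j): observation model
O = [[0.7, 0.3],
     [0.1, 0.9]]

def forward_step(belief, obs):
    """One HMM filtering step: predict through T, then condition on obs via O."""
    # Prediction: push the current belief through the transition model.
    predicted = [sum(belief[i] * T[i][j] for i in range(2)) for j in range(2)]
    # Correction: weight each state by the likelihood of the observation.
    unnorm = [predicted[j] * O[j][obs] for j in range(2)]
    z = sum(unnorm)  # normalizing constant (probability of the observation)
    return [u / z for u in unnorm]

# Filter a short observation sequence, starting from a uniform prior.
belief = [0.5, 0.5]
for obs in [0, 0, 1]:
    belief = forward_step(belief, obs)
```

Because each update touches only a fixed-size belief vector, the discrete case runs in constant time per slice; the paper's stronger result concerns networks with continuous variables, where this reduction no longer applies.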


