
The Computational Power of Dynamic Bayesian Networks

03/19/2016
by Joshua Brulé et al., University of Maryland

This paper considers the computational power of constant-size dynamic Bayesian networks. Although discrete dynamic Bayesian networks are no more powerful than hidden Markov models, dynamic Bayesian networks with continuous random variables and discrete children of continuous parents are capable of performing Turing-complete computation. With modified versions of existing algorithms for belief propagation, such a simulation can be carried out in real time. This result suggests that dynamic Bayesian networks may be more powerful than previously considered. Relationships to causal models and recurrent neural networks are also discussed.
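The abstract's first claim, that discrete dynamic Bayesian networks are no more powerful than hidden Markov models, rests on the fact that inference in a discrete two-timeslice DBN reduces to the standard HMM forward (filtering) recursion. A minimal sketch of that recursion is shown below; the two-state model and all probability values are hypothetical, chosen only for illustration.

```python
def forward_filter(prior, transition, emission, observations):
    """HMM forward algorithm: returns P(state_t | obs_1..t) for each t."""
    n = len(prior)
    belief = prior[:]
    history = []
    for obs in observations:
        # Predict step: propagate the belief through the transition model.
        predicted = [sum(belief[i] * transition[i][j] for i in range(n))
                     for j in range(n)]
        # Update step: weight by the evidence likelihood and renormalize.
        unnorm = [predicted[j] * emission[j][obs] for j in range(n)]
        z = sum(unnorm)
        belief = [p / z for p in unnorm]
        history.append(belief)
    return history

# Hypothetical parameters: 2 hidden states, 2 observation symbols.
prior = [0.5, 0.5]
transition = [[0.7, 0.3],   # P(next state | current state)
              [0.3, 0.7]]
emission = [[0.9, 0.1],     # P(observation | state)
            [0.2, 0.8]]
beliefs = forward_filter(prior, transition, emission, [0, 0, 1])
```

Each filtering step costs constant time for a constant-size network, which is the sense in which the paper's real-time simulation claim should be read; the Turing-completeness result, by contrast, requires continuous variables with discrete children and is not captured by this discrete sketch.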

