On tensor rank of conditional probability tables in Bayesian networks

09/22/2014
by Jiří Vomlel, et al.

A difficult task in modeling with Bayesian networks is the elicitation of their numerical parameters. A large number of parameters is needed to specify a conditional probability table (CPT) with a large parent set. In this paper we show that most CPTs from real applications of Bayesian networks can actually be approximated very well by tables that require substantially fewer parameters. This observation has practical consequences not only for model elicitation but also for efficient probabilistic reasoning with these networks.
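To make the low-rank idea concrete, here is a minimal sketch in Python (not code from the paper; the parent count and the inhibition probabilities are made-up illustration values). It builds the CPT of a noisy-OR gate with ten binary parents, a table with 2^11 = 2048 entries, and reproduces it exactly from a rank-2 CP decomposition whose factor vectors contain only 44 numbers.

# Minimal sketch (not the authors' code): a noisy-OR CPT over n binary parents
# written exactly as a rank-2 CP (canonical polyadic) decomposition.
# The parent count n and the inhibition probabilities q are made-up values.
import numpy as np
from functools import reduce

rng = np.random.default_rng(0)
n = 10                                    # hypothetical number of binary parents
q = rng.uniform(0.1, 0.9, size=n)         # hypothetical inhibition probabilities

# Full CPT as a tensor T[y, x1, ..., xn] = P(Y = y | x1, ..., xn), where the
# noisy-OR model gives P(Y = 0 | x) = prod_i q_i**x_i.
grid = np.indices((2,) * n)               # grid[i] holds x_i over all parent states
p_y0 = np.prod(np.power(q.reshape((n,) + (1,) * n), grid), axis=0)
T = np.stack([p_y0, 1.0 - p_y0])          # 2**(n+1) table entries

# Rank-2 CP decomposition: T = (1, -1) ⊗ u_1 ⊗ ... ⊗ u_n + (0, 1) ⊗ 1 ⊗ ... ⊗ 1
# with u_i = (1, q_i), i.e. O(n) numbers instead of 2**(n+1).
def outer(vecs):
    """Outer product of 1-D vectors, returned as a dense tensor."""
    return reduce(lambda a, b: np.tensordot(a, b, axes=0), vecs)

term1 = outer([np.array([1.0, -1.0])] + [np.array([1.0, qi]) for qi in q])
term2 = outer([np.array([0.0, 1.0])] + [np.ones(2)] * n)
T_cp = term1 + term2

assert np.allclose(T, T_cp)
print(f"CPT entries: {T.size}, CP factor entries: {2 * 2 * (n + 1)}")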

Related research

01/23/2013  Learning Bayesian Networks with Restricted Causal Interactions
A major problem for the learning of Bayesian networks (BNs) is the expon...

07/27/2014  Conditional Plausibility Measures and Bayesian Networks
A general notion of algebraic conditional plausibility measures is defin...

10/13/2018  Categorical Aspects of Parameter Learning
Parameter learning is the technique for obtaining the probabilistic para...

07/04/2012  Modifying Bayesian Networks by Probability Constraints
This paper deals with the following problem: modify a Bayesian network t...

05/29/2019  Learning Bayesian Networks with Low Rank Conditional Probability Tables
In this paper, we provide a method to learn the directed structure of a ...

07/23/2012  Probability Bracket Notation, Multivariable Systems and Static Bayesian Networks
Probability Bracket Notation (PBN) is applied to systems of multiple ran...

01/18/2013  User Interface Tools for Navigation in Conditional Probability Tables and Elicitation of Probabilities in Bayesian Networks
Elicitation of probabilities is one of the most laborious tasks in build...
