
Exploiting Functional Dependence in Bayesian Network Inference

by   Jirka Vomlel, et al.

We propose an efficient method for Bayesian network inference in models with functional dependence. We generalize the multiplicative factorization method originally designed by Takikawa and D'Ambrosio (1999) for models with independence of causal influence. Using a hidden variable, we transform a probability potential into a product of two-dimensional potentials. The multiplicative factorization yields more efficient inference; for example, in junction tree propagation it helps to avoid large cliques. In order to keep potentials small, the number of states of the hidden variable should be minimized. We transform this problem into a combinatorial problem of finding a minimal base in a particular space. We present an example of a computerized adaptive test in which the factorization method is significantly more efficient than previous inference methods.
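To make the idea concrete, here is a minimal sketch of such a multiplicative factorization for the simplest functional dependence, a deterministic OR of three binary parents (Y = X1 OR X2 OR X3). It is illustrative only, in the spirit of the Takikawa and D'Ambrosio (1999) construction, not the paper's exact method: a hidden variable B with two states replaces the 4-dimensional potential P(Y | X1, X2, X3) with a product of two-dimensional potentials psi(b, x), plus a signed 2-D potential s(y, b); summing B out recovers the original conditional probability table. All function names here are made up for the example.

```python
from itertools import product

# Two-dimensional "cumulative" potential psi(b, x) = F(b | x) for each parent:
# F(0 | x) = 1 iff x == 0, and F(1 | x) = 1 always.
def psi(b, x):
    return 1.0 if b == 1 else (1.0 if x == 0 else 0.0)

# Signed 2-D potential s(y, b): +1 when b == y, -1 when b == y - 1, else 0.
# It turns the difference of cumulative products into a sum over b.
def s(y, b):
    if b == y:
        return 1.0
    if b == y - 1:
        return -1.0
    return 0.0

def factored_p(y, xs):
    """P(Y = y | xs) recovered by summing out the hidden variable b,
    using only two-dimensional potentials."""
    total = 0.0
    for b in (0, 1):
        prod = 1.0
        for x in xs:
            prod *= psi(b, x)
        total += s(y, b) * prod
    return total

# The factorization reproduces the functional CPT of OR exactly.
for xs in product((0, 1), repeat=3):
    y_true = int(any(xs))
    for y in (0, 1):
        assert factored_p(y, xs) == (1.0 if y == y_true else 0.0)
```

Note that each psi depends on only one parent and the hidden variable, so in junction tree propagation no clique ever needs to contain all of X1, X2, X3 at once; keeping the number of states of B small (here, two) is what keeps these potentials small.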




Independence of Causal Influence and Clique Tree Propagation

This paper explores the role of independence of causal influence (ICI) i...

Exploiting Causal Independence in Bayesian Network Inference

A new method is proposed for exploiting causal independencies in exact B...

Propagation using Chain Event Graphs

A Chain Event Graph (CEG) is a graphical model designed to embody c...

Approximation by Quantization

Inference in graphical models consists of repeatedly multiplying and sum...

Multiplicative Factorization of Noisy-Max

The noisy-or and its generalization noisy-max have been utilized to redu...

Probabilistic Argumentation and Information Algebras of Probability Potentials on Families of Compatible Frames

Probabilistic argumentation is an alternative to causal modeling with Ba...

Continuous Multiclass Labeling Approaches and Algorithms

We study convex relaxations of the image labeling problem on a continuou...