Parameter Adjustment in Bayes Networks. The generalized noisy OR-gate

03/06/2013
by Francisco Javier Díez, et al.

Spiegelhalter and Lauritzen [15] studied sequential learning in Bayesian networks and proposed three models for the representation of conditional probabilities. A fourth model, shown here, assumes that the parameter distribution is given by a product of Gaussian functions and updates it from the λ and π messages of evidence propagation. We also generalize the noisy OR-gate to multivalued variables, develop an algorithm that computes the probability in time proportional to the number of parents (even in networks with loops), and apply the learning model to this gate.
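The linear-time property mentioned above is easiest to see in the classical binary noisy OR-gate, where each active parent independently fails to cause the effect. The sketch below is an illustration of that standard model, not of the paper's multivalued generalization; the function name and the optional leak parameter are choices made here for clarity.

```python
def noisy_or(link_probs, parent_states, leak=0.0):
    """P(effect present) under a binary noisy OR-gate.

    link_probs[i]    -- probability that parent i alone causes the effect
    parent_states[i] -- True if parent i is present (instantiated to 1)
    leak             -- probability the effect occurs with no parent present

    Cost is one multiplication per parent, i.e. O(n) rather than the
    O(2^n) entries of a full conditional probability table.
    """
    q = 1.0 - leak  # probability the effect is absent so far
    for c, on in zip(link_probs, parent_states):
        if on:
            q *= 1.0 - c  # each active cause independently fails with prob 1-c
    return 1.0 - q
```

For example, two active causes with link probabilities 0.8 and 0.5 yield 1 - (0.2)(0.5) = 0.9, since the effect is absent only if both causes fail independently.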
