
Mix-nets: Factored Mixtures of Gaussians in Bayesian Networks with Mixed Continuous and Discrete Variables

01/16/2013
by Scott Davies, et al.

Recently developed techniques have made it possible to learn accurate probability density functions quickly from data in low-dimensional continuous spaces. In particular, mixtures of Gaussians can be fitted to data very quickly using an accelerated EM algorithm that employs multiresolution kd-trees (Moore, 1999). In this paper, we propose a kind of Bayesian network in which low-dimensional mixtures of Gaussians over different subsets of the domain's variables are combined into a coherent joint probability model over the entire domain. These networks can also model complex dependencies between discrete and continuous variables without requiring discretization of the continuous variables. We present efficient heuristic algorithms for automatically learning these networks from data, and perform comparative experiments illustrating how well they model real scientific data and synthetic data. We also briefly discuss possible improvements to the networks, as well as possible applications.
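The core construction described in the abstract can be sketched in a few lines: fit a low-dimensional mixture of Gaussians jointly over a node and its parents, then obtain the conditional density as P(child | parents) = P(child, parents) / P(parents), where the parent marginal is itself a Gaussian mixture obtained by dropping the child's dimensions from each component. The sketch below is a minimal illustration of that idea, not the authors' implementation: it uses scikit-learn's standard EM rather than the accelerated kd-tree EM of Moore (1999), and the single-parent structure and all variable names are assumptions made for the example.

```python
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic data: one continuous parent u and a child x that depends on it.
u = rng.normal(size=2000)
x = np.sin(u) + 0.1 * rng.normal(size=2000)
data = np.column_stack([x, u])  # dim 0 = child, dim 1 = parent

# Fit a low-dimensional joint mixture of Gaussians over (child, parent).
gmm = GaussianMixture(n_components=5, covariance_type="full").fit(data)

def mixture_pdf(point, weights, means, covs):
    # Density of a Gaussian mixture at a single point.
    return sum(w * multivariate_normal.pdf(point, mean=m, cov=c)
               for w, m, c in zip(weights, means, covs))

def conditional_pdf(x_val, u_val, gmm, parent_dims=(1,)):
    # P(child | parent) = joint mixture density / marginal mixture density.
    # Marginalizing a Gaussian mixture only requires dropping the child's
    # rows/columns from each component's mean and covariance.
    joint = mixture_pdf(np.array([x_val, u_val]), gmm.weights_,
                        gmm.means_, gmm.covariances_)
    p = list(parent_dims)
    marg_means = gmm.means_[:, p]
    marg_covs = gmm.covariances_[:, p][:, :, p]
    marginal = mixture_pdf(np.array([u_val]), gmm.weights_,
                           marg_means, marg_covs)
    return joint / marginal

print(conditional_pdf(0.8, 1.0, gmm))  # density of x = 0.8 given u = 1.0
```

Because the marginal of a Gaussian mixture is again a Gaussian mixture, each conditional stays in closed form, and the network's joint density is the product of these per-node conditionals.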

