Recursive Bayesian Networks: Generalising and Unifying Probabilistic Context-Free Grammars and Dynamic Bayesian Networks

11/02/2021
by Robert Lieck, et al.

Probabilistic context-free grammars (PCFGs) and dynamic Bayesian networks (DBNs) are widely used sequence models with complementary strengths and limitations. While PCFGs allow for nested hierarchical dependencies (tree structures), their latent variables (non-terminal symbols) have to be discrete. In contrast, DBNs allow for continuous latent variables, but the dependencies are strictly sequential (chain structure). Therefore, neither can be applied if the latent variables are assumed to be continuous and also to have a nested hierarchical dependency structure. In this paper, we present Recursive Bayesian Networks (RBNs), which generalise and unify PCFGs and DBNs, combining their strengths and containing both as special cases. RBNs define a joint distribution over tree-structured Bayesian networks with discrete or continuous latent variables. The main challenge lies in performing joint inference over the exponential number of possible structures and the continuous variables. We provide two solutions: 1) For arbitrary RBNs, we generalise inside and outside probabilities from PCFGs to the mixed discrete-continuous case, which allows for maximum posterior estimates of the continuous latent variables via gradient descent, while marginalising over network structures. 2) For Gaussian RBNs, we additionally derive an analytic approximation, allowing for robust parameter optimisation and Bayesian inference. The capacity and diverse applications of RBNs are illustrated on two examples: In a quantitative evaluation on synthetic data, we demonstrate and discuss the advantage of RBNs for segmentation and tree induction from noisy sequences, compared to change point detection and hierarchical clustering. In an application to musical data, we approach the unsolved problem of hierarchical music analysis from the raw note level and compare our results to expert annotations.
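The paper's first solution generalises inside and outside probabilities from PCFGs to the mixed discrete-continuous case. For reference, the classical discrete inside algorithm that is being generalised can be sketched as follows; the toy grammar, sentence, and variable names are illustrative assumptions, not taken from the paper:

```python
from collections import defaultdict

# Toy PCFG in Chomsky normal form (illustrative only).
# Binary rules: head -> (left, right) with probability p.
binary_rules = {
    "S": [(("NP", "VP"), 1.0)],
    "VP": [(("V", "NP"), 1.0)],
}
# Lexical rules: head -> terminal with probability p.
lexical_rules = {
    "NP": {"she": 0.5, "fish": 0.5},
    "V": {"eats": 1.0},
}

def inside_probabilities(sentence):
    """CYK-style inside algorithm.

    beta[(i, j)][A] = P(A derives words i..j-1), summing over all
    tree structures spanning that interval.
    """
    n = len(sentence)
    beta = defaultdict(lambda: defaultdict(float))
    # Base case: lexical rules cover single-word spans.
    for i, word in enumerate(sentence):
        for head, lex in lexical_rules.items():
            if word in lex:
                beta[(i, i + 1)][head] += lex[word]
    # Recursion: combine adjacent sub-spans at every split point,
    # marginalising over the exponential number of tree structures.
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for head, rules in binary_rules.items():
                    for (left, right), p in rules:
                        beta[(i, j)][head] += (
                            p * beta[(i, k)][left] * beta[(k, j)][right]
                        )
    return beta

sentence = ["she", "eats", "fish"]
beta = inside_probabilities(sentence)
print(beta[(0, len(sentence))]["S"])  # prints 0.25
```

In an RBN, the discrete sums over non-terminal symbols in the recursion above are replaced by integrals over continuous latent variables, which is what makes joint inference over structures and variables the main challenge the paper addresses.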


Related research

- 07/04/2012 — Hybrid Bayesian Networks with Linear Deterministic Variables: "When a hybrid Bayesian network has conditionally deterministic variables..."
- 01/23/2013 — A Variational Approximation for Bayesian Networks with Discrete and Continuous Latent Variables: "We show how to use a variational approximation to the logistic function..."
- 01/09/2015 — Margins of discrete Bayesian networks: "Bayesian network models with latent variables are widely used in statist..."
- 04/23/2022 — SIReN-VAE: Leveraging Flows and Amortized Inference for Bayesian Networks: "Initial work on variational autoencoders assumed independent latent vari..."
- 04/13/2022 — Grand canonical ensembles of sparse networks and Bayesian inference: "Maximum entropy network ensembles have been very successful in modelling..."
- 01/16/2013 — Mix-nets: Factored Mixtures of Gaussians in Bayesian Networks With Mixed Continuous And Discrete Variables: "Recently developed techniques have made it possible to quickly learn acc..."
- 01/16/2020 — A Critical Look at the Applicability of Markov Logic Networks for Music Signal Analysis: "In recent years, Markov logic networks (MLNs) have been proposed as a po..."
