
Variable Elimination in the Fourier Domain

by Yexiang Xue, et al.

The ability to represent complex high-dimensional probability distributions in a compact form is one of the key insights in the field of graphical models. Factored representations are ubiquitous in machine learning and lead to major computational advantages. We explore a different type of compact representation based on discrete Fourier representations, complementing the classical approach based on conditional independencies. We show that a large class of probabilistic graphical models has a compact Fourier representation. This theoretical result opens up an entirely new way of approximating a probability distribution. We demonstrate the significance of this approach by applying it to the variable elimination algorithm. Compared with the traditional bucket representation and other approximate inference algorithms, we obtain significant improvements.
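To make the baseline concrete, here is a minimal sketch of classical (table-based) variable elimination over binary variables, the algorithm the paper augments with Fourier representations. The factor encoding (scope tuple plus assignment table) and the example factors are illustrative assumptions, not taken from the paper.

```python
from itertools import product

# A factor is (scope, table): a tuple of variable names plus a dict
# mapping each joint assignment of those variables to a weight.

def multiply(f, g):
    """Pointwise product of two factors over the union of their scopes."""
    fv, ft = f
    gv, gt = g
    vs = tuple(dict.fromkeys(fv + gv))  # order-preserving union of scopes
    table = {}
    for assign in product([0, 1], repeat=len(vs)):
        a = dict(zip(vs, assign))
        table[assign] = ft[tuple(a[v] for v in fv)] * gt[tuple(a[v] for v in gv)]
    return vs, table

def sum_out(f, var):
    """Marginalize one variable out of a factor."""
    fv, ft = f
    vs = tuple(v for v in fv if v != var)
    table = {}
    for assign, val in ft.items():
        key = tuple(x for v, x in zip(fv, assign) if v != var)
        table[key] = table.get(key, 0.0) + val
    return vs, table

def eliminate(factors, order):
    """Bucket-style elimination: for each variable, multiply the factors
    that mention it, sum it out, and return the partition function Z."""
    for var in order:
        bucket = [f for f in factors if var in f[0]]
        rest = [f for f in factors if var not in f[0]]
        prod = bucket[0]
        for f in bucket[1:]:
            prod = multiply(prod, f)
        rest.append(sum_out(prod, var))
        factors = rest
    z = 1.0
    for _, table in factors:  # only empty-scope (constant) factors remain
        z *= sum(table.values())
    return z

# Hypothetical two-variable model: a pairwise factor and a unary factor.
f_ab = (('A', 'B'), {(0, 0): 1.0, (0, 1): 2.0, (1, 0): 3.0, (1, 1): 4.0})
f_b = (('B',), {(0,): 0.5, (1,): 1.5})
print(eliminate([f_ab, f_b], ['A', 'B']))  # → 11.0
```

The intermediate tables produced here can grow exponentially with the induced width of the elimination order; the paper's contribution is replacing these tables with compact Fourier representations inside the same elimination loop.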
