
Lifted Relational Variational Inference

by Jaesik Choi, et al.

Hybrid continuous-discrete models naturally represent many real-world applications in robotics, finance, and environmental engineering. Inference with large-scale models is challenging because relational structure deteriorates rapidly once observations are incorporated. The main contribution of this paper is an efficient relational variational inference algorithm that factors large-scale probability models into simpler variational models composed of mixtures of iid (Bernoulli) random variables. The algorithm takes relational probability models of large-scale hybrid systems and converts them into close-to-optimal variational models. It then efficiently calculates marginal probabilities on the variational models using latent (or lifted) variable elimination or lifted stochastic sampling. This inference is unique in that it maintains the relational structure under individual observations and throughout the inference steps.
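To make the core idea concrete, here is a minimal sketch of the kind of variational factorization the abstract describes: approximating a coupled model over binary variables by a product of independent Bernoulli variables, fitted by coordinate-ascent (mean-field) updates. This is a generic mean-field example, not the paper's lifted algorithm; the toy model (`b`, `W`) and all function names are illustrative assumptions.

```python
import math
import itertools

# Toy pairwise binary model: p(x) ∝ exp(Σ_i b[i] x_i + Σ_{i<j} W[i][j] x_i x_j).
# Variational approximation: q(x) = Π_i Bernoulli(x_i; mu[i]), a product of
# independent Bernoulli variables, as in the factorization the abstract describes.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def mean_field(b, W, iters=100):
    """Coordinate-ascent updates mu_i = sigmoid(b_i + Σ_j W_ij mu_j)."""
    n = len(b)
    mu = [0.5] * n
    for _ in range(iters):
        for i in range(n):
            mu[i] = sigmoid(b[i] + sum(W[i][j] * mu[j] for j in range(n) if j != i))
    return mu

def exact_marginals(b, W):
    """Brute-force marginals P(x_i = 1) by enumerating all 2^n states."""
    n = len(b)
    Z, marg = 0.0, [0.0] * n
    for x in itertools.product([0, 1], repeat=n):
        e = sum(b[i] * x[i] for i in range(n))
        e += sum(W[i][j] * x[i] * x[j] for i in range(n) for j in range(i + 1, n))
        w = math.exp(e)
        Z += w
        for i in range(n):
            marg[i] += w * x[i]
    return [m / Z for m in marg]

# Illustrative 3-variable model with two pairwise couplings.
b = [0.3, -0.2, 0.1]
W = [[0.0, 0.5, 0.0],
     [0.5, 0.0, -0.4],
     [0.0, -0.4, 0.0]]
print("mean-field:", mean_field(b, W))
print("exact:     ", exact_marginals(b, W))
```

On this weakly coupled toy model the factored Bernoulli approximation reproduces the exact marginals closely; the paper's contribution is doing an analogous factorization at the relational (lifted) level, so the cost does not grow with the number of ground variables.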




Lifted Inference for Relational Continuous Models

Relational Continuous Models (RCMs) represent joint probability densitie...

Lifted Marginal MAP Inference

Lifted inference reduces the complexity of inference in relational proba...

Lifted Hybrid Variational Inference

A variety of lifted inference algorithms, which exploit model symmetry t...

Variational Probabilistic Inference and the QMR-DT Network

We describe a variational approximation method for efficient inference i...

Local Expectation Gradients for Doubly Stochastic Variational Inference

We introduce local expectation gradients which is a general purpose stoc...

Factorized Fusion Shrinkage for Dynamic Relational Data

Modern data science applications often involve complex relational data w...

A Complete Characterization of Projectivity for Statistical Relational Models

A generative probabilistic model for relational data consists of a famil...