Multi-Modal Mean-Fields via Cardinality-Based Clamping

11/23/2016
by   Pierre Baqué, et al.

Mean Field inference is central to statistical physics and has attracted much interest in the Computer Vision community as an efficient way to solve problems expressible in terms of large Conditional Random Fields. However, because it models the posterior probability distribution as a product of marginal probabilities, it may fail to properly account for important dependencies between variables. We therefore replace the fully factorized distribution of Mean Field by a weighted mixture of such distributions, which similarly minimizes the KL-divergence to the true posterior. We make this minimization efficient by introducing two new ideas: conditioning on groups of variables instead of single ones, and selecting those groups using a parameter of the Conditional Random Field potentials that we identify with the temperature in the statistical-physics sense. Our extension of the clamping method proposed in previous works lets us both produce a more descriptive approximation of the true posterior and, inspired by the diverse-MAP paradigm, fit a mixture of Mean Field approximations. We demonstrate that this positively impacts real-world algorithms that initially relied on Mean Fields.
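To make the two building blocks concrete, here is a minimal sketch of naive mean-field coordinate ascent on a toy ±1 Ising model, followed by a two-component mixture obtained by clamping one variable to each of its states and weighting the resulting mean-field runs by their free-energy bounds. The toy model, the choice of which variable to clamp, and all function names are illustrative assumptions, not the paper's actual algorithm (which selects groups of variables via a temperature parameter):

```python
import numpy as np

def mean_field(J, h, clamp=None, iters=200):
    """Naive mean field for p(x) ∝ exp(Σ_i h_i x_i + Σ_{ij} J_ij x_i x_j), x_i ∈ {-1,+1}.
    Returns marginal means m_i = E_q[x_i] and the mean-field lower bound on log Z."""
    n = len(h)
    m = np.zeros(n)
    if clamp:
        for i, v in clamp.items():  # clamped variables are fixed, not updated
            m[i] = v
    for _ in range(iters):
        for i in range(n):
            if clamp and i in clamp:
                continue
            m[i] = np.tanh(h[i] + J[i] @ m)  # coordinate-ascent update of q_i
    # free-energy bound = expected energy + entropy of the factorized q
    q1 = (1 + m) / 2  # q(x_i = +1)
    entropy = -sum(p * np.log(p) + (1 - p) * np.log(1 - p)
                   for p in q1 if 0 < p < 1)  # clamped vars contribute 0 entropy
    energy = h @ m + 0.5 * m @ J @ m
    return m, energy + entropy

rng = np.random.default_rng(0)
n = 5
J = rng.normal(scale=0.5, size=(n, n))
J = (J + J.T) / 2
np.fill_diagonal(J, 0)
h = rng.normal(scale=0.1, size=n)

# plain (single-mode) mean field
m_mf, _ = mean_field(J, h)

# mixture of mean fields: clamp x_0 to each state and weight the two runs
# by exp(free-energy bound), i.e. by their approximate partition functions
ms, logw = zip(*(mean_field(J, h, clamp={0: s}) for s in (-1, +1)))
w = np.exp(np.array(logw) - max(logw))
w /= w.sum()
m_mix = w @ np.array(ms)  # mixture marginal means
```

Clamping removes the hardest dependency from each mean-field run, so each mixture component can capture a different mode that a single factorized distribution would average away.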


Related research:

11/04/2019  Statistical Inference in Mean-Field Variational Bayes
  We conduct non-asymptotic analysis on the mean-field variational inferen...

02/16/2018  The Mean-Field Approximation: Information Inequalities, Algorithms, and Complexity
  The mean field approximation to the Ising model is a canonical variation...

09/03/2021  Variational Bayes algorithm and posterior consistency of Ising model parameter estimation
  Ising models originated in statistical physics and are widely used in mo...

08/22/2016  Efficient Continuous Relaxations for Dense CRF
  Dense conditional random fields (CRF) with Gaussian pairwise potentials ...

01/30/2013  Mixture Representations for Inference and Learning in Boltzmann Machines
  Boltzmann machines are undirected graphical models with two-state stocha...

03/29/2018  Copula Variational Bayes inference via information geometry
  Variational Bayes (VB), also known as independent mean-field approximati...

01/23/2013  Mixture Approximations to Bayesian Networks
  Structure and parameters in a Bayesian network uniquely specify the prob...
