Tighter Linear Program Relaxations for High Order Graphical Models

09/26/2013
by Elad Mezuman, et al.

Graphical models with High Order Potentials (HOPs) have received considerable interest in recent years. While there are a variety of approaches to inference in these models, nearly all of them amount to solving a linear program (LP) relaxation with unary consistency constraints between the HOP and the individual variables. In many cases, the resulting relaxations are loose, and in these cases the results of inference can be poor. It is thus desirable to look for more accurate ways of performing inference in these models. In this work, we study the LP relaxations that result from enforcing additional consistency constraints between the HOP and the rest of the model. We address theoretical questions about the strength of the resulting relaxations compared to the relaxations that arise in standard approaches, and we develop practical and efficient message passing algorithms for optimizing the LPs. Empirically, we show that the LPs with additional consistency constraints lead to more accurate inference on some challenging problems that include a combination of low order and high order terms.
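The standard relaxation the abstract refers to, with unary consistency constraints between a HOP and the individual variables, can be sketched as a small LP. The example below is an illustrative toy (three binary variables, random potentials, solved with `scipy.optimize.linprog`); the problem size and potentials are assumptions for demonstration, not taken from the paper.

```python
import numpy as np
from itertools import product
from scipy.optimize import linprog

# Toy model: 3 binary variables with unary potentials theta_unary[i, x]
# and one high-order potential (HOP) theta_hop[x1, x2, x3] over all three.
# LP variables: mu_i(x) for i = 0..2, x in {0,1}  -> indices 0..5
#               mu_H(x1, x2, x3)                  -> indices 6..13
rng = np.random.default_rng(0)
theta_unary = rng.normal(size=(3, 2))   # hypothetical random potentials
theta_hop = rng.normal(size=(2, 2, 2))

n_un, n_hop = 6, 8
# linprog minimizes, so negate to maximize the total potential.
c = -np.concatenate([theta_unary.ravel(), theta_hop.ravel()])

def hop_index(cfg):
    # Position of mu_H(cfg) in the LP variable vector.
    return n_un + cfg[0] * 4 + cfg[1] * 2 + cfg[2]

A_eq, b_eq = [], []

# Normalization of each unary pseudomarginal: sum_x mu_i(x) = 1.
for i in range(3):
    row = np.zeros(n_un + n_hop)
    row[2 * i:2 * i + 2] = 1.0
    A_eq.append(row); b_eq.append(1.0)

# Normalization of the HOP pseudomarginal.
row = np.zeros(n_un + n_hop)
row[n_un:] = 1.0
A_eq.append(row); b_eq.append(1.0)

# Unary consistency between the HOP and each variable:
# sum_{cfg : cfg[i] = x} mu_H(cfg) = mu_i(x).
for i in range(3):
    for x in (0, 1):
        row = np.zeros(n_un + n_hop)
        for cfg in product((0, 1), repeat=3):
            if cfg[i] == x:
                row[hop_index(cfg)] = 1.0
        row[2 * i + x] = -1.0
        A_eq.append(row); b_eq.append(0.0)

res = linprog(c, A_eq=np.array(A_eq), b_eq=np.array(b_eq),
              bounds=[(0.0, 1.0)] * (n_un + n_hop), method="highs")
print("LP optimum (upper bound on the MAP value):", -res.fun)
```

Because this toy has a single factor covering all variables, the relaxation is tight; the looseness the paper addresses arises when several low-order and high-order factors overlap but are tied together only through these unary consistency constraints.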

