LPQP for MAP: Putting LP Solvers to Better Use

by Patrick Pletscher et al.

MAP inference for general energy functions remains a challenging problem. While most efforts are channeled towards improving the linear programming (LP) relaxation, this work is motivated by the quadratic programming (QP) relaxation. We propose a novel MAP relaxation that penalizes the Kullback-Leibler divergence between the LP pairwise auxiliary variables and the equivalent QP terms given by the product of the unaries. We develop two efficient algorithms based on variants of this relaxation. The algorithms minimize the non-convex objective using belief propagation and dual decomposition as building blocks. Experiments on synthetic and real-world data show that the solutions returned by our algorithms substantially improve over the LP relaxation.
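The penalty described above couples each pairwise marginal to the outer product of its unary marginals; when the two agree exactly, the KL term vanishes. A minimal sketch of that per-edge penalty (the function name and the `eps` smoothing are my own, not the authors' code):

```python
import numpy as np

def kl_pairwise_vs_product(mu_ij, mu_i, mu_j, eps=1e-12):
    """KL(mu_ij || mu_i mu_j^T): an LPQP-style penalty coupling a
    pairwise marginal mu_ij to the product of its unary marginals."""
    prod = np.outer(mu_i, mu_j)
    # eps guards the logs against zero entries in either distribution
    return float(np.sum(mu_ij * (np.log(mu_ij + eps) - np.log(prod + eps))))

# Consistent marginals incur (essentially) zero penalty:
mu_i = np.array([0.5, 0.5])
mu_j = np.array([0.3, 0.7])
mu_ij = np.outer(mu_i, mu_j)
print(kl_pairwise_vs_product(mu_ij, mu_i, mu_j))  # ~ 0.0
```

Summing such terms over all edges, weighted by a penalty parameter, yields a non-convex objective that interpolates between the LP and QP relaxations, which is why the paper's algorithms rely on belief propagation and dual decomposition rather than a convex solver.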


Belief Propagation for Linear Programming

Belief Propagation (BP) is a popular, distributed heuristic for performi...

On Partial Optimality by Auxiliary Submodular Problems

In this work, we prove several relations between three different energy ...

Compact Relaxations for MAP Inference in Pairwise MRFs with Piecewise Linear Priors

Label assignment problems with large state spaces are important tasks es...

Efficient Algorithms for Global Inference in Internet Marketplaces

Matching demand to supply in internet marketplaces (e-commerce, ride-sha...

Learning Optimal Parameters for Multi-target Tracking with Contextual Interactions

We describe an end-to-end framework for learning parameters of min-cost ...

Fast and Complete: Enabling Complete Neural Network Verification with Rapid and Massively Parallel Incomplete Verifiers

Formal verification of neural networks (NNs) is a challenging and import...

Block Stability for MAP Inference

To understand the empirical success of approximate MAP inference, recent...

Code Repositories


Combined LP and QP relaxation for MAP inference
