Accelerating Message Passing for MAP with Benders Decomposition
We introduce a novel mechanism to tighten the local polytope relaxation for MAP inference in Markov random fields whose variables have small state spaces. We consider a surjection of the variables onto a set of hyper-variables and apply the local polytope relaxation over these hyper-variables. The state space of each individual hyper-variable is constructed to be enumerable, while the Cartesian product of the state spaces of pairs of hyper-variables is not easily enumerable, making message passing inference intractable. To circumvent the difficulty of enumerating these product state spaces, we introduce a novel Benders decomposition approach. It produces an upper envelope approximating the message, constructed from affine functions of the individual variables that compose the hyper-variable receiving the message. The envelope is tight at the minimizers, which it shares with the true message. Benders rows are constructed to be Pareto optimal and are generated with an efficient procedure targeted at binary problems.
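The abstract does not spell out the construction, so the sketch below is only a rough illustration of the general mechanism it relies on: a textbook Benders cutting-plane loop in which a value function Q(x) = min_y { c^T y : B y >= b - A x, y >= 0 } over binary vectors x is approximated from below by the upper envelope (pointwise max) of affine cuts obtained from LP duality, with the envelope tight at the minimizer. This is a generic stand-in, not the paper's specialized Pareto-optimal row generation for binary subproblems; the toy problem data, the symbols A, B, b, c, and the helper names are all our own assumptions.

```python
# Generic Benders cutting-plane sketch (illustrative only; not the paper's
# specialized procedure).  The value function
#     Q(x) = min_y { c^T y : B y >= b - A x, y >= 0 }
# is approximated from below by the upper envelope of affine cuts, each cut
# coming from an LP dual solution and being tight at the x where it was made.
import itertools
import numpy as np
from scipy.optimize import linprog


def subproblem(x, A, B, b, c):
    """Solve the inner LP at a fixed binary vector x; return its value and a
    Benders cut (alpha, beta) with alpha @ x' + beta <= Q(x') for all x',
    and equality at the queried x."""
    rhs = b - A @ x
    # linprog expects A_ub y <= b_ub, so rewrite B y >= rhs as -B y <= -rhs.
    res = linprog(c, A_ub=-B, b_ub=-rhs, bounds=[(0, None)] * len(c),
                  method="highs")
    lam = -res.ineqlin.marginals          # duals of B y >= rhs, lam >= 0
    # Weak duality gives Q(x') >= lam @ (b - A x') for every x'; strong
    # duality makes this bound tight at the current x.
    alpha = -(A.T @ lam)
    beta = float(lam @ b)
    return res.fun, alpha, beta


def benders_envelope(A, B, b, c, n_bits, tol=1e-7):
    """Lazily add cuts until the envelope is tight at its own minimizer."""
    cuts = []                              # list of (alpha, beta) pairs

    def envelope(x):
        return max((a @ x + be for a, be in cuts), default=-np.inf)

    states = [np.array(s) for s in itertools.product([0, 1], repeat=n_bits)]
    while True:
        x_hat = min(states, key=envelope)  # minimize the current envelope
        exact, alpha, beta = subproblem(x_hat, A, B, b, c)
        if exact <= envelope(x_hat) + tol: # envelope already tight here
            return x_hat, exact, cuts
        cuts.append((alpha, beta))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_bits, n_y, n_con = 4, 6, 5
    A = rng.normal(size=(n_con, n_bits))
    B = rng.uniform(0.5, 1.5, size=(n_con, n_y))   # positive, so LP is feasible
    b = rng.uniform(0.0, 1.0, size=n_con)
    c = rng.uniform(0.5, 2.0, size=n_y)            # positive, so LP is bounded
    x_star, val, cuts = benders_envelope(A, B, b, c, n_bits)
    print("minimizer:", x_star, "value:", round(val, 4), "cuts used:", len(cuts))
```

In this toy, the cuts play the role the abstract assigns to Benders rows: each is affine in the binary variables of the receiving side, and the loop stops exactly when the envelope agrees with the true value at the shared minimizer, so the full product state space is never enumerated by the master.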