High Dimensional Inference with Random Maximum A-Posteriori Perturbations

02/10/2016, by Tamir Hazan, et al.

This paper presents a new approach, called perturb-max, to high-dimensional statistical inference that applies random perturbations followed by optimization. The framework injects randomness into maximum a-posteriori (MAP) predictors by randomly perturbing the potential function of the input. A classic result from extreme value statistics asserts that perturb-max operations generate unbiased samples from the Gibbs distribution when the perturbations are high-dimensional. Unfortunately, the computational cost of generating so many high-dimensional random variables can be prohibitive. When the perturbations are low-dimensional, however, sampling the perturb-max prediction is as efficient as MAP optimization. This paper shows that the expected value of perturb-max inference with low-dimensional perturbations can be used sequentially to generate unbiased samples from the Gibbs distribution. Furthermore, the expected value of the maximal perturbations is a natural bound on the entropy of such perturb-max models. A measure-concentration result for perturb-max values shows that the deviation of their sampled average from its expectation decays exponentially in the number of samples, so the expectation can be approximated effectively.
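The classic extreme-value result the abstract refers to is often called the Gumbel-max trick: adding i.i.d. Gumbel noise to each configuration's potential and taking the MAP of the perturbed potential yields an exact sample from the Gibbs distribution. The sketch below illustrates this in the simplest (fully enumerated, low-dimensional) setting; the potentials and sample count are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

theta = np.array([1.0, 2.0, 0.5])            # potentials for 3 configurations
gibbs = np.exp(theta) / np.exp(theta).sum()  # target Gibbs distribution

def perturb_max_sample(theta, rng):
    """One perturb-max draw: MAP of the Gumbel-perturbed potential."""
    gumbel = rng.gumbel(size=theta.shape)    # i.i.d. Gumbel(0, 1) perturbations
    return np.argmax(theta + gumbel)

samples = [perturb_max_sample(theta, rng) for _ in range(100_000)]
freq = np.bincount(samples, minlength=theta.size) / len(samples)
print(np.round(gibbs, 3), np.round(freq, 3))  # empirical frequencies track Gibbs
```

Note that this toy version enumerates all configurations and perturbs each one with its own Gumbel variable, which is exactly the high-dimensional-perturbation regime the abstract calls computationally prohibitive at scale; the paper's contribution concerns what can still be recovered with low-dimensional perturbations.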

Related research

- 06/27/2012 · On the Partition Function and Random Maximum A-Posteriori Perturbations: "In this paper we relate the partition function to the max-statistics of ..."
- 09/27/2020 · Strong replica symmetry for high-dimensional disordered log-concave Gibbs measures: "We consider a generic class of log-concave, possibly random, (Gibbs) mea..."
- 11/03/2021 · Perturb-and-max-product: Sampling and learning in discrete energy-based models: "Perturb-and-MAP offers an elegant approach to approximately sample from ..."
- 02/02/2020 · Fast Generating A Large Number of Gumbel-Max Variables: "The well-known Gumbel-Max Trick for sampling elements from a categorical..."
- 11/13/2019 · The Value of the High, Low and Close in the Estimation of Brownian Motion: Extended Version: "The conditional density of Brownian motion is considered given the max, ..."
- 08/19/2022 · Estimating a potential without the agony of the partition function: "Estimating a Gibbs density function given a sample is an important probl..."
- 02/08/2019 · Generating the support with extreme value losses: "When optimizing against the mean loss over a distribution of predictions..."
