Proximal MCMC for Bayesian Inference of Constrained and Regularized Estimation

05/15/2022
by Xinkai Zhou, et al.

This paper advocates proximal Markov Chain Monte Carlo (ProxMCMC) as a flexible and general Bayesian inference framework for constrained or regularized estimation. Originally introduced in the Bayesian imaging literature, ProxMCMC employs the Moreau-Yosida envelope to smoothly approximate the total-variation regularization term, fixes nuisance and regularization strength parameters as constants, and relies on the Langevin algorithm for posterior sampling. We extend ProxMCMC to a full Bayesian framework with modeling and data-adaptive estimation of all parameters, including the regularization strength. More efficient sampling algorithms such as Hamiltonian Monte Carlo are employed to scale ProxMCMC to high-dimensional problems. Analogous to proximal algorithms in optimization, ProxMCMC offers a versatile and modular procedure for inference in constrained and non-smooth problems. The power of ProxMCMC is illustrated on various statistical estimation and machine learning tasks whose inference is traditionally considered difficult from both frequentist and Bayesian perspectives.
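For intuition, here is a minimal sketch, not the paper's implementation, of the proximal Langevin idea the abstract describes: the non-smooth penalty g is replaced by its Moreau-Yosida envelope, whose gradient is (x - prox_{lambda*g}(x)) / lambda, and an unadjusted Langevin step is taken on the smoothed posterior. The lasso-type Gaussian likelihood, function names, and step sizes below are illustrative assumptions; the paper's ProxMCMC additionally treats the regularization strength as a parameter and uses more efficient samplers such as Hamiltonian Monte Carlo.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||x||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def myula_lasso(y, X, alpha, n_samples=5000, gamma=1e-4, lam=1e-2, seed=0):
    """Illustrative Moreau-Yosida unadjusted Langevin sampler for a
    lasso-type posterior pi(beta) ∝ exp(-0.5*||y - X beta||^2 - alpha*||beta||_1).

    gamma: Langevin step size; lam: Moreau-Yosida smoothing parameter.
    This is a sketch of the proximal Langevin building block only, with
    alpha, gamma, and lam fixed rather than estimated.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta = np.zeros(p)
    samples = np.empty((n_samples, p))
    for k in range(n_samples):
        grad_smooth = X.T @ (X @ beta - y)        # gradient of the smooth (Gaussian) log-likelihood term
        prox = soft_threshold(beta, lam * alpha)  # prox of lam * alpha * ||.||_1
        grad_envelope = (beta - prox) / lam       # gradient of the Moreau-Yosida envelope of alpha*||.||_1
        beta = (beta - gamma * (grad_smooth + grad_envelope)
                + np.sqrt(2.0 * gamma) * rng.standard_normal(p))
        samples[k] = beta
    return samples
```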
