Newtonian Monte Carlo: single-site MCMC meets second-order gradient methods

01/15/2020
by Nimar S. Arora, et al.

Single-site Markov Chain Monte Carlo (MCMC) is a variant of MCMC in which a single coordinate in the state space is modified in each step. Structured relational models are a good candidate for this style of inference. In the single-site context, second-order methods become feasible because the typical cubic cost associated with these methods is restricted to the dimension of each coordinate. Our work, which we call Newtonian Monte Carlo (NMC), is a method to improve MCMC convergence by analyzing the first- and second-order gradients of the target density to determine a suitable proposal density at each point. Existing first-order gradient-based methods suffer from the problem of determining an appropriate step size: too small a step size requires a large number of steps to converge, while too large a step size overshoots the high-density region. NMC is similar to the Newton-Raphson update in optimization, where the second-order gradient is used to automatically scale the step size in each dimension. However, our objective is to find a parameterized proposal density rather than the maximum. As a further improvement on existing first- and second-order methods, we show that random variables with constrained supports do not need to be transformed before taking a gradient step. We demonstrate the efficiency of NMC on a number of different domains. For statistical models where the prior is conjugate to the likelihood, our method recovers the posterior trivially in one step. However, we also show results on fairly large non-conjugate models, where NMC performs better than adaptive first-order methods such as NUTS and other inexact scalable inference methods such as Stochastic Variational Inference or bootstrapping.
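The core idea above can be sketched as a Newton-style Gaussian proposal: the mean is one Newton-Raphson step from the current point, and the covariance is the negative inverse Hessian of the log density. This is a minimal illustrative sketch, not the authors' implementation; the function name `nmc_proposal` and the Gaussian example are assumptions chosen to demonstrate the one-step conjugate case mentioned in the abstract.

```python
import numpy as np

def nmc_proposal(x, grad, hess):
    """Newton-style Gaussian proposal: the mean is one Newton step from x,
    and the covariance is the negative inverse Hessian of the log density
    (assumed negative definite near the mode)."""
    hess_inv = np.linalg.inv(hess)
    mean = x - hess_inv @ grad   # Newton-Raphson step
    cov = -hess_inv              # curvature-scaled proposal covariance
    return mean, cov

# Illustration on a Gaussian target N(mu, Sigma): the log density is
# exactly quadratic, so the Newton step lands on mu and the proposal
# covariance equals Sigma -- the "one step" conjugate behavior.
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.5], [0.5, 1.0]])
Sigma_inv = np.linalg.inv(Sigma)

x = np.array([5.0, 5.0])         # arbitrary current state
grad = -Sigma_inv @ (x - mu)     # gradient of log p at x
hess = -Sigma_inv                # Hessian of log p (constant here)

mean, cov = nmc_proposal(x, grad, hess)
print(np.allclose(mean, mu), np.allclose(cov, Sigma))  # True True
```

For non-Gaussian targets the log density is not quadratic, so the proposal is only a local approximation and a Metropolis-Hastings accept/reject step is still required; when the Hessian is not negative definite, the paper's method would need a corrective treatment that this sketch omits.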


