A stochastic Stein Variational Newton method

04/19/2022
by Alex Leviyev, et al.

Stein variational gradient descent (SVGD) is a general-purpose, optimization-based sampling algorithm that has recently exploded in popularity, but it is limited by two issues: it is known to produce biased samples, and it can be slow to converge on complicated distributions. A recently proposed stochastic variant of SVGD (sSVGD) addresses the first issue by injecting a carefully chosen noise term into the SVGD dynamics, yielding unbiased samples with guaranteed asymptotic convergence. Meanwhile, Stein variational Newton (SVN), a Newton-like extension of SVGD, dramatically accelerates convergence by incorporating Hessian information into the dynamics, but it too produces biased samples. In this paper we derive, and provide a practical implementation of, a stochastic variant of SVN (sSVN) that is both asymptotically correct and rapidly convergent. We demonstrate the effectiveness of the algorithm on a difficult class of test problems, the Hybrid Rosenbrock density, and show that sSVN converges using three orders of magnitude fewer gradient evaluations of the log likelihood than its stochastic SVGD counterpart. Our results show that sSVN is a promising approach to accelerating high-precision Bayesian inference tasks in modest dimension, d ∼ 𝒪(10).
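For context, the baseline SVGD update that both sSVGD and sSVN build on can be sketched in a few lines. The sketch below is illustrative only and is not the authors' implementation: the standard-Gaussian target, the RBF kernel, the median-heuristic bandwidth, and the step size are all assumptions made for the example.

```python
# Minimal SVGD sketch (illustrative; not the paper's code).
import numpy as np

def grad_log_p(x):
    # Score of an assumed standard Gaussian target: grad log p(x) = -x.
    return -x

def svgd_step(X, step=0.1):
    """One SVGD update for particles X of shape (n, d)."""
    n = X.shape[0]
    diffs = X[:, None, :] - X[None, :, :]        # (n, n, d), entries x_j - x_i
    sq_dists = np.sum(diffs**2, axis=-1)         # (n, n)
    # Median-heuristic bandwidth (an assumption; the paper may use another rule).
    h = np.median(sq_dists) / np.log(n + 1) + 1e-8
    K = np.exp(-sq_dists / h)                    # K[j, i] = k(x_j, x_i)
    # grad_{x_j} k(x_j, x_i) = -(2/h) (x_j - x_i) k(x_j, x_i)
    grad_K = -(2.0 / h) * diffs * K[:, :, None]  # (n, n, d)
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
    phi = (K.T @ grad_log_p(X) + grad_K.sum(axis=0)) / n
    return X + step * phi

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2)) + 5.0  # start the particles far from the target
for _ in range(500):
    X = svgd_step(X)
print(X.mean(axis=0), X.std(axis=0))  # should drift toward mean 0, std 1
```

sSVGD modifies this step by adding a kernel-dependent noise term so that the particle dynamics become asymptotically exact, while SVN and sSVN additionally precondition the update direction phi with (approximate) Hessian information of the log posterior; both refinements are omitted from the sketch for brevity.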

Related research

A Stein variational Newton method (06/08/2018)
Stein variational gradient descent (SVGD) was recently proposed as a gen...

Stochastic Stein Discrepancies (07/06/2020)
Stein discrepancies (SDs) monitor convergence and non-convergence in app...

A Stochastic Gradient Method with Biased Estimation for Faster Nonconvex Optimization (05/13/2019)
A number of optimization approaches have been proposed for optimizing no...

Stochastic Approximate Gradient Descent via the Langevin Algorithm (02/13/2020)
We introduce a novel and efficient algorithm called the stochastic appro...

Kernel Stein Generative Modeling (07/06/2020)
We are interested in gradient-based Explicit Generative Modeling where s...

Rayleigh-Gauss-Newton optimization with enhanced sampling for variational Monte Carlo (06/19/2021)
Variational Monte Carlo (VMC) is an approach for computing ground-state ...

Discriminative Bayesian Filtering Lends Momentum to the Stochastic Newton Method for Minimizing Log-Convex Functions (04/27/2021)
To minimize the average of a set of log-convex functions, the stochastic...
