Algorithmic Theory of ODEs and Sampling from Well-conditioned Logconcave Densities

12/15/2018
by Yin Tat Lee et al.

Sampling logconcave functions arising in statistics and machine learning has been a subject of intensive study. Recent developments include analyses of Langevin dynamics and Hamiltonian Monte Carlo (HMC). While both approaches have dimension-independent bounds for the underlying continuous processes under sufficiently strong smoothness conditions, the resulting discrete algorithms have complexity and number of function evaluations growing with the dimension. Motivated by this problem, we give a general algorithm for solving multivariate ordinary differential equations whose solutions are close to the span of a known basis of functions (e.g., polynomials or piecewise polynomials). The resulting algorithm has polylogarithmic depth and essentially tight runtime: it is nearly linear in the size of the representation of the solution. We apply this to the sampling problem to obtain a nearly linear implementation of HMC for a broad class of smooth, strongly logconcave densities, with the number of iterations (parallel depth) and gradient evaluations polylogarithmic in the dimension (rather than polynomial, as in previous work). This class includes the widely used loss function for logistic regression with incoherent weight matrices, which has been the subject of much recent study. We also give a faster algorithm with polylogarithmic depth for the more general and standard class of strongly convex functions with Lipschitz gradient. These results are based on (1) an improved contraction bound for the exact HMC process and (2) logarithmic bounds on the degree of polynomials that approximate solutions of the differential equations arising in implementing HMC.
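To make the two ingredients concrete, here is a minimal Python sketch of the underlying idea (not the authors' implementation): represent the HMC trajectory x''(t) = -grad f(x(t)) by its values at Chebyshev nodes, and refine it by Picard iteration, fitting a low-degree polynomial to the gradient along the current trajectory and applying the integral operator exactly. The function names, the polynomial degree, and the iteration count are illustrative assumptions.

```python
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev

def collocation_hmc_trajectory(grad_f, x0, v0, T, degree=8, iters=15):
    """Approximate the HMC ODE  x''(t) = -grad_f(x(t)), x(0)=x0, x'(0)=v0
    on [0, T] by Picard iteration in a polynomial basis:
        x(t) = x0 + t*v0 - integral_0^t (t - s) grad_f(x(s)) ds.
    The trajectory is stored by its values at Chebyshev nodes; each pass
    fits a Chebyshev polynomial to grad_f along the current trajectory and
    evaluates the double integral exactly.  (Illustrative sketch only.)
    """
    d = len(x0)
    # Chebyshev-Lobatto points mapped to [0, T]
    nodes = 0.5 * T * (1.0 - np.cos(np.pi * np.arange(degree + 1) / degree))
    X = x0[None, :] + nodes[:, None] * v0[None, :]   # initial guess: free motion
    for _ in range(iters):
        G = np.array([grad_f(x) for x in X])         # gradients at the nodes
        Xnew = np.empty_like(X)
        for j in range(d):                           # coordinate by coordinate
            p = Chebyshev.fit(nodes, G[:, j], deg=degree, domain=[0.0, T])
            P2 = p.integ(2, lbnd=0.0)                # P2(0) = P2'(0) = 0
            Xnew[:, j] = x0[j] + nodes * v0[j] - P2(nodes)
        X = Xnew
    # endpoint position and velocity of the approximate trajectory
    G = np.array([grad_f(x) for x in X])
    vT = np.empty(d)
    for j in range(d):
        p = Chebyshev.fit(nodes, G[:, j], deg=degree, domain=[0.0, T])
        vT[j] = v0[j] - p.integ(1, lbnd=0.0)(T)
    return X[-1], vT

def hmc(grad_f, x, T, n_steps, seed=0):
    """Idealized HMC: resample Gaussian momentum, then follow the
    Hamiltonian flow approximated by the collocation solver above."""
    rng = np.random.default_rng(seed)
    for _ in range(n_steps):
        v = rng.standard_normal(len(x))
        x, _ = collocation_hmc_trajectory(grad_f, x, v, T)
    return x
```

For smooth, strongly logconcave f, the paper's degree bounds say the true HMC trajectory is approximated by a polynomial of polylogarithmic degree, which is what makes a small fixed `degree` plausible here; a practical sampler would monitor the fixed-point residual rather than hard-code `iters`.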


