Complexity of zigzag sampling algorithm for strongly log-concave distributions

12/21/2020
by   Jianfeng Lu, et al.

We study the computational complexity of the zigzag sampling algorithm for strongly log-concave distributions. The zigzag process has the advantages that its implementation requires no time discretization, that each proposed bouncing event requires only one evaluation of a partial derivative of the potential, and that its convergence rate is dimension independent. Using these properties, we prove that, under a warm start assumption and in the regime κ ≪ d/log d, the zigzag sampling algorithm achieves ε error in chi-square divergence at a computational cost equivalent to O(κ^2 d^{1/2} (log(1/ε))^{3/2}) gradient evaluations, where κ is the condition number and d is the dimension.
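To make the dynamics concrete, here is a minimal sketch of a zigzag sampler for the special case of a standard Gaussian target U(x) = ||x||^2/2, where each coordinate's event rate is piecewise linear in time and bounce times can be drawn exactly by inversion. For general potentials one would instead propose events by thinning against a local rate bound, so that each proposal costs one partial-derivative evaluation, as the abstract notes. The function name `zigzag_gaussian` and all parameters are illustrative, not from the paper:

```python
import numpy as np

def zigzag_gaussian(d, n_events, rng=None):
    """Zigzag sampler for the standard Gaussian target U(x) = ||x||^2 / 2.

    Coordinate i bounces at rate max(0, v_i * dU/dx_i(x + v t)).
    For this target the rate is max(0, v_i*x_i + t), so the first bounce
    time of each coordinate can be sampled exactly by inverting the
    integrated rate. Returns the piecewise-linear trajectory skeleton.
    """
    rng = np.random.default_rng(rng)
    x = rng.standard_normal(d)            # illustrative initial point
    v = rng.choice([-1.0, 1.0], size=d)   # zigzag velocities are +/- 1
    times, xs = [0.0], [x.copy()]
    t = 0.0
    for _ in range(n_events):
        # Candidate bounce times: with a_i = v_i * x_i, the integrated
        # rate up to tau is max(a_i,0)*tau + tau^2/2 (once positive), so
        # solving against E ~ Exp(1) gives a closed-form inversion.
        a = v * x
        e = rng.exponential(size=d)
        tau = -a + np.sqrt(np.maximum(a, 0.0) ** 2 + 2.0 * e)
        j = int(np.argmin(tau))           # first coordinate to bounce
        dt = tau[j]
        x = x + v * dt                    # deterministic straight-line motion
        v[j] = -v[j]                      # flip only that coordinate's velocity
        t += dt
        times.append(t)
        xs.append(x.copy())
    return np.array(times), np.array(xs), v

times, xs, v = zigzag_gaussian(d=5, n_events=2000, rng=0)
```

After each event all candidate times are regenerated from the new state; by the memoryless property of the exponential clocks this reproduces the correct first-event distribution of the piecewise-deterministic process. Note that time-weighted averages along the trajectory, not the raw event points, are the consistent estimators of expectations under the target.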


Related research

05/29/2021  The query complexity of sampling from strongly log-concave distributions in one dimension
We establish the first tight lower bound of Ω(log log κ) on the query comp...

02/13/2018  Stochastic Variance-Reduced Hamilton Monte Carlo Methods
We propose a fast stochastic Hamilton Monte Carlo (HMC) method, for samp...

04/05/2023  Query lower bounds for log-concave sampling
Log-concave sampling has witnessed remarkable algorithmic advances in re...

02/27/2018  Mirrored Langevin Dynamics
We generalize the Langevin Dynamics through the mirror descent framework...

02/20/2023  Faster high-accuracy log-concave sampling via algorithmic warm starts
Understanding the complexity of sampling from a strongly log-concave and...

02/24/2018  Dimensionally Tight Running Time Bounds for Second-Order Hamiltonian Monte Carlo
Hamiltonian Monte Carlo (HMC) is a widely deployed method to sample from...

05/31/2023  Conditionally Strongly Log-Concave Generative Models
There is a growing gap between the impressive results of deep image gene...
