Schrödinger-Föllmer Sampler: Sampling without Ergodicity

06/21/2021
by Jian Huang, et al.

Sampling from probability distributions is an important problem in statistics and machine learning, especially in Bayesian inference when integration with respect to the posterior distribution is intractable and sampling from the posterior is the only viable option for inference. In this paper, we propose the Schrödinger-Föllmer sampler (SFS), a novel approach for sampling from possibly unnormalized distributions. The proposed SFS is based on the Schrödinger-Föllmer diffusion process on the unit interval with a time-dependent drift term, which transports the degenerate distribution at time zero to the target distribution at time one. In contrast to existing Markov chain Monte Carlo samplers, which require ergodicity, no such requirement is needed for SFS. Computationally, SFS can be easily implemented using the Euler-Maruyama discretization. In our theoretical analysis, we establish non-asymptotic error bounds for the sampling distribution of SFS in the Wasserstein distance under suitable conditions. We conduct numerical experiments to evaluate the performance of SFS and demonstrate that it generates samples of better quality than several existing methods.
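The scheme described in the abstract can be sketched as follows: simulate the Schrödinger-Föllmer diffusion on [0, 1] from the point mass at the origin with Euler-Maruyama steps, estimating the drift by Monte Carlo. The sketch below is illustrative, not the authors' implementation: the two-component Gaussian mixture target, the sample sizes, and the Gaussian integration-by-parts form of the drift estimator (which avoids differentiating the density ratio) are all assumptions made for a minimal, self-contained example.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 2  # dimension of the toy target

def log_p(x):
    # Unnormalized log-density of an assumed toy target:
    # an equal-weight mixture of N((2,2), I) and N((-2,-2), I).
    c1 = -0.5 * np.sum((x - 2.0) ** 2, axis=-1)
    c2 = -0.5 * np.sum((x + 2.0) ** 2, axis=-1)
    return np.logaddexp(c1, c2)

def f_ratio(x):
    # f(x) = p(x) / phi(x): ratio of the (unnormalized) target density
    # to the standard Gaussian density (constants cancel in the drift).
    log_phi = -0.5 * np.sum(x ** 2, axis=-1)
    return np.exp(log_p(x) - log_phi)

def drift(x, t, m=100):
    # Monte Carlo estimate of b(x, t) = grad log Q_{1-t} f(x), where
    # Q_s f(x) = E[f(x + sqrt(s) Z)], Z ~ N(0, I).  By Gaussian
    # integration by parts, grad Q_s f(x) = E[Z f(x + sqrt(s) Z)] / sqrt(s).
    s = np.sqrt(1.0 - t)
    z = rng.standard_normal((m, d))
    fv = f_ratio(x[None, :] + s * z)        # shape (m,)
    num = (z * fv[:, None]).mean(axis=0)    # estimates E[Z f(...)]
    den = fv.mean()                         # estimates E[f(...)]
    return num / (s * den + 1e-12)

def sfs_sample(n_steps=100):
    # Euler-Maruyama discretization on [0, 1], started from the
    # degenerate distribution at zero; t_k = k/n keeps 1 - t >= 1/n,
    # so the drift estimate stays finite at every step.
    h = 1.0 / n_steps
    x = np.zeros(d)
    for k in range(n_steps):
        t = k * h
        x = x + h * drift(x, t) + np.sqrt(h) * rng.standard_normal(d)
    return x

samples = np.stack([sfs_sample() for _ in range(200)])
print(samples.mean(axis=0))
```

Note that no burn-in or convergence diagnostic appears anywhere in the loop: each trajectory runs for exactly one unit of time and its endpoint is a draw, which is the practical payoff of not relying on ergodicity.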


