Entropy-based Training Methods for Scalable Neural Implicit Sampler

06/08/2023
by Weijian Luo, et al.

Efficiently sampling from unnormalized target distributions is a fundamental problem in scientific computing and machine learning. Traditional approaches such as Markov Chain Monte Carlo (MCMC) guarantee asymptotically unbiased samples from such distributions but are computationally inefficient, particularly for high-dimensional targets, because they require many iterations to generate a single batch of samples. In this paper, we propose an efficient and scalable neural implicit sampler that overcomes these limitations. Our sampler generates large batches of samples at low computational cost by leveraging a neural transformation that maps easily sampled latent vectors directly to target samples, without any iterative procedure. To train the neural implicit sampler, we introduce two novel methods: the KL training method, which minimizes the Kullback-Leibler divergence, and the Fisher training method, which minimizes the Fisher divergence. These training methods effectively optimize the neural implicit sampler to capture the desired target distribution. To demonstrate the effectiveness, efficiency, and scalability of our proposed samplers, we evaluate them on three sampling benchmarks of different scales: sampling from 2D targets, Bayesian inference, and sampling from high-dimensional energy-based models (EBMs). Notably, in the high-dimensional EBM experiment, our sampler produces samples comparable to those generated by MCMC-based methods while being more than 100 times more efficient. We believe the theoretical and empirical contributions presented in this work will stimulate further research on efficient samplers for applications beyond those explored in this study.
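
To make the KL training idea concrete, here is a minimal sketch, not the paper's implementation, of an implicit sampler trained against an unnormalized 2D target. The ring-shaped energy, the network shapes, and the use of an auxiliary score network fit by denoising score matching to approximate the sampler's otherwise intractable score grad_x log q(x) are all assumptions made for illustration. The structure follows from KL(q||p) = E_{x~q}[log q(x) + E(x)] + const when p(x) is proportional to exp(-E(x)): the generator update needs only the target energy and an estimate of the sampler's score.

    import torch
    import torch.nn as nn

    def energy(x):
        # unnormalized 2D target: a ring of radius 2,
        # p(x) proportional to exp(-energy(x))
        return (x.norm(dim=-1) - 2.0) ** 2 / 0.1

    def mlp(din, dout, h=128):
        return nn.Sequential(nn.Linear(din, h), nn.SiLU(),
                             nn.Linear(h, h), nn.SiLU(),
                             nn.Linear(h, dout))

    sampler = mlp(2, 2)    # implicit sampler: latent z -> sample x, no iterations
    score_net = mlp(2, 2)  # auxiliary net approximating grad_x log q(x) (assumption)
    opt_g = torch.optim.Adam(sampler.parameters(), lr=1e-4)
    opt_s = torch.optim.Adam(score_net.parameters(), lr=1e-4)
    sigma = 0.05           # noise level for denoising score matching (assumption)

    for step in range(10000):
        # (1) Fit score_net to the current sampler distribution q via denoising
        #     score matching: for x_noisy = x + sigma * eps, the regression
        #     target of the smoothed score is -eps / sigma.
        x = sampler(torch.randn(256, 2)).detach()
        eps = torch.randn_like(x)
        loss_s = ((score_net(x + sigma * eps) + eps / sigma) ** 2).sum(-1).mean()
        opt_s.zero_grad(); loss_s.backward(); opt_s.step()

        # (2) KL-style generator update. Treating the score estimate as a
        #     constant, backprop through energy(x) + <score, x> yields the
        #     path-derivative gradient of KL(q||p) w.r.t. the sampler weights.
        x = sampler(torch.randn(256, 2))
        with torch.no_grad():
            s = score_net(x)
        loss_g = (energy(x) + (s * x).sum(-1)).mean()
        opt_g.zero_grad(); loss_g.backward(); opt_g.step()

A Fisher-style objective would instead penalize the squared difference between the sampler's score and the target score, E_q ||grad_x log q(x) - grad_x log p(x)||^2, reusing the same score estimate. After training, drawing a fresh batch is a single forward pass, e.g. sampler(torch.randn(n, 2)).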

Related research

A Neural Network MCMC sampler that maximizes Proposal Entropy (10/07/2020)
Markov Chain Monte Carlo (MCMC) methods sample from unnormalized probabi...

Stein Neural Sampler (10/08/2018)
We propose two novel samplers to produce high-quality samples from a giv...

Efficient Topology-Controlled Sampling of Implicit Shapes (05/16/2012)
Sampling from distributions of implicitly defined shapes enables analysi...

Schrödinger-Föllmer Sampler: Sampling without Ergodicity (06/21/2021)
Sampling from probability distributions is an important problem in stati...

A Single SMC Sampler on MPI that Outperforms a Single MCMC Sampler (05/24/2019)
Markov Chain Monte Carlo (MCMC) is a well-established family of algorith...

Sampling by Divergence Minimization (05/02/2021)
We introduce a family of Markov Chain Monte Carlo (MCMC) methods designe...

Zonotope hit-and-run for efficient sampling from projection DPPs (05/30/2017)
Determinantal point processes (DPPs) are distributions over sets of item...
