Stein Neural Sampler

10/08/2018
by   Tianyang Hu, et al.

We propose two novel samplers that produce high-quality samples from a given (un-normalized) probability density. Sampling is achieved by transforming a reference distribution into the target distribution with neural networks, which are trained separately by minimizing two kinds of Stein discrepancies; hence the name Stein neural sampler. Theoretical and empirical results suggest that, compared with traditional sampling schemes, our samplers offer three advantages: (1) they are asymptotically correct; (2) they suffer fewer convergence issues in practice; (3) they generate samples instantaneously.
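To make the idea concrete, below is a minimal, self-contained sketch of this kind of sampler in PyTorch. It is not the authors' implementation: the network architecture, the toy Gaussian target `log_p`, the RBF bandwidth, and the names (`Sampler`, `ksd`) are all illustrative assumptions. The sketch only shows the general mechanism, training a generator network by minimizing a kernelized Stein discrepancy (one of the Stein discrepancies the abstract refers to), which needs only the score of the un-normalized target.

```python
# Illustrative sketch only (not the paper's code): train a network to push
# Gaussian reference noise toward an un-normalized target by minimizing a
# kernelized Stein discrepancy (KSD) with an RBF kernel.
import torch
import torch.nn as nn

def log_p(x):
    # Un-normalized log-density of the target; a standard 2-D Gaussian here
    # as a toy example. Any differentiable un-normalized log-density works,
    # since only its gradient (score) enters the objective.
    return -0.5 * (x ** 2).sum(dim=1)

class Sampler(nn.Module):
    """Transforms reference noise z ~ N(0, I) into approximate target samples."""
    def __init__(self, dim=2, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, dim),
        )
    def forward(self, z):
        return self.net(z)

def ksd(x, score, h=1.0):
    # U-statistic estimate of the squared KSD with the RBF kernel
    # k(x, y) = exp(-||x - y||^2 / (2 h^2)).
    n, d = x.shape
    diff = x.unsqueeze(1) - x.unsqueeze(0)                        # (n, n, d): x_i - x_j
    sq = (diff ** 2).sum(-1)
    k = torch.exp(-sq / (2 * h ** 2))                             # kernel matrix
    term1 = (score @ score.t()) * k                               # s(x_i)^T s(x_j) k
    term2 = ((score.unsqueeze(1) * diff).sum(-1) / h ** 2) * k    # s(x_i)^T grad_y k
    term3 = -((score.unsqueeze(0) * diff).sum(-1) / h ** 2) * k   # s(x_j)^T grad_x k
    term4 = (d / h ** 2 - sq / h ** 4) * k                        # tr(grad_x grad_y k)
    u = term1 + term2 + term3 + term4
    return (u.sum() - u.diagonal().sum()) / (n * (n - 1))

sampler = Sampler()
opt = torch.optim.Adam(sampler.parameters(), lr=1e-3)
for step in range(2000):
    z = torch.randn(256, 2)                                       # reference samples
    x = sampler(z)                                                # pushed-forward samples
    # Score of the target at x via autograd; the unknown normalizing constant
    # drops out of grad log p, which is what makes Stein discrepancies usable here.
    s = torch.autograd.grad(log_p(x).sum(), x, create_graph=True)[0]
    loss = ksd(x, s)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Once trained, new samples come from a single forward pass, i.e. instantaneously:
samples = sampler(torch.randn(10000, 2)).detach()
```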

Related research

06/08/2023 · Entropy-based Training Methods for Scalable Neural Implicit Sampler
Efficiently sampling from un-normalized target distributions is a fundam...

07/18/2021 · A stepped sampling method for video detection using LSTM
Artificial neural networks that simulate humans achieve great successes....

11/30/2021 · Path Integral Sampler: a stochastic control approach for sampling
We present Path Integral Sampler (PIS), a novel algorithm to draw sample...

10/06/2021 · Relative Entropy Gradient Sampler for Unnormalized Distributions
We propose a relative entropy gradient sampler (REGS) for sampling from ...

04/08/2023 · Efficient Multimodal Sampling via Tempered Distribution Flow
Sampling from high-dimensional distributions is a fundamental problem in...

06/20/2016 · An Empirical Comparison of Sampling Quality Metrics: A Case Study for Bayesian Nonnegative Matrix Factorization
In this work, we empirically explore the question: how can we assess the...

10/24/2020 · On Testing of Samplers
Given a set of items ℱ and a weight function 𝚠𝚝: ℱ↦ (0,1), the problem o...
