Continuous Herded Gibbs Sampling

06/11/2021
by Laura M. Wolf, et al.

Herding is a technique to sequentially generate deterministic samples from a probability distribution. In this work, we propose a continuous herded Gibbs sampler that combines kernel herding on continuous densities with Gibbs sampling. Our algorithm allows for deterministic sampling from high-dimensional multivariate probability densities without directly sampling from the joint density. Experiments with Gaussian mixture densities indicate that the L2 error decreases similarly to that of kernel herding, while the computation time is significantly lower, scaling linearly with the number of dimensions.
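The abstract names the two building blocks: 1-D kernel herding applied to conditional densities, cycled Gibbs-style over the dimensions. The sketch below is only an illustration of that combination under simplifying assumptions (a Gaussian kernel with a closed-form expectation, a diagonal-covariance Gaussian mixture so the conditionals remain mixtures, a grid search for each 1-D argmax, and per-dimension sample histories); it is not the authors' implementation, whose bookkeeping may differ.

```python
# Illustrative sketch only: coordinate-wise "herded Gibbs" steps built from
# 1-D kernel herding on the conditionals of a 2-D Gaussian mixture with
# diagonal covariances. Bandwidth, grid search, and the handling of past
# samples are simplifying assumptions, not the paper's implementation.
import numpy as np

# Gaussian mixture parameters (diagonal covariances)
weights = np.array([0.4, 0.6])
means = np.array([[-2.0, 0.0], [1.5, 2.0]])   # shape (components, dims)
stds = np.array([[0.8, 1.0], [0.6, 0.9]])     # shape (components, dims)
h = 0.5                                        # Gaussian kernel bandwidth
grid = np.linspace(-6.0, 6.0, 2001)            # 1-D search grid for the argmax


def gauss(x, mu, s):
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (np.sqrt(2 * np.pi) * s)


def conditional_mixture(d, x_other, other_d):
    """Weights/means/stds of p(x_d | x_other) for a diagonal-covariance mixture."""
    w = weights * gauss(x_other, means[:, other_d], stds[:, other_d])
    return w / w.sum(), means[:, d], stds[:, d]


def herding_objective(x, w, mu, s, past):
    """Kernel herding score: E_p[k(x, x')] minus the average kernel to past samples
    (the average is a common simplification of the 1/(t+1)-weighted sum)."""
    expect = np.zeros_like(x)
    for wj, mj, sj in zip(w, mu, s):
        # Closed-form expectation of the Gaussian kernel under a Gaussian component
        v = h ** 2 + sj ** 2
        expect += wj * np.sqrt(h ** 2 / v) * np.exp(-0.5 * (x - mj) ** 2 / v)
    if past:
        past = np.asarray(past)
        expect -= np.exp(-0.5 * (x[:, None] - past[None, :]) ** 2 / h ** 2).mean(axis=1)
    return expect


state = np.array([0.0, 0.0])   # current Gibbs state
history = [[], []]             # past herded values, one list per dimension
samples = []

for t in range(200):
    for d in range(2):         # cycle over dimensions; cost grows linearly with dims
        other = 1 - d
        w, mu, s = conditional_mixture(d, state[other], other)
        scores = herding_objective(grid, w, mu, s, history[d])
        state[d] = grid[np.argmax(scores)]   # deterministic "sample" of coordinate d
        history[d].append(state[d])
    samples.append(state.copy())

print(np.array(samples)[:5])
```

Because each iteration only solves one 1-D optimization per dimension rather than a search over the joint space, the cost per sample grows linearly with the number of dimensions, which is the scaling behavior the abstract reports.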

