
Computing Functions of Random Variables via Reproducing Kernel Hilbert Space Representations
We describe a method to perform functional operations on probability dis...
01/27/2015 ∙ by Bernhard Schölkopf, et al.

Optimally-Weighted Herding is Bayesian Quadrature
Herding and kernel herding are deterministic methods of choosing samples...
08/09/2014 ∙ by Ferenc Huszár, et al.

New Tricks for Estimating Gradients of Expectations
We derive a family of Monte Carlo estimators for gradients of expectatio...
01/31/2019 ∙ by Christian J. Walder, et al.

Hilbert Space Embeddings of Predictive State Representations
Predictive State Representations (PSRs) are an expressive class of model...
09/26/2013 ∙ by Byron Boots, et al.

Autoencoding any Data through Kernel Autoencoders
This paper investigates a novel algorithmic approach to data representat...
05/28/2018 ∙ by Pierre Laforgue, et al.

Hilbert Space Embeddings of POMDPs
A nonparametric approach for policy learning for POMDPs is proposed. The...
10/16/2012 ∙ by Yu Nishiyama, et al.

The connection between Bayesian estimation of a Gaussian random field and RKHS
Reconstruction of a function from noisy data is often formulated as a re...
01/22/2013 ∙ by Aleksandr Y. Aravkin, et al.
Super-Samples from Kernel Herding
We extend the herding algorithm to continuous spaces by using the kernel trick. The resulting "kernel herding" algorithm is an infinite memory deterministic process that learns to approximate a PDF with a collection of samples. We show that kernel herding decreases the error of expectations of functions in the Hilbert space at a rate O(1/T), which is much faster than the usual O(1/√T) for iid random samples. We illustrate kernel herding by approximating Bayesian predictive distributions.
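The greedy selection rule behind kernel herding can be sketched in a few lines. This is a minimal illustration, not the authors' code: it assumes an RBF kernel and represents the target distribution p by a finite candidate pool, so the mean embedding E_{x'~p} k(x, x') is approximated by an average over that pool. Each "super-sample" maximizes the pool's mean embedding minus the average kernel to the points already chosen.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel matrix: k(x, y) = exp(-gamma * ||x - y||^2)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_herding(candidates, n_samples, gamma=1.0):
    """Greedily pick n_samples points whose empirical mean embedding
    tracks that of the candidate pool (a stand-in for the target p).
    Hypothetical helper, for illustration only."""
    K = rbf_kernel(candidates, candidates, gamma)
    mu = K.mean(axis=1)                  # approximates E_{x'~p} k(x, x')
    chosen = []
    sum_k = np.zeros(len(candidates))    # running sum of k(x, x_s) over picks
    for t in range(n_samples):
        # Herding objective: mean embedding minus average kernel
        # to the already-selected super-samples.
        scores = mu - sum_k / (t + 1)
        idx = int(np.argmax(scores))
        chosen.append(idx)
        sum_k += K[:, idx]
    return candidates[chosen]

rng = np.random.default_rng(0)
pool = rng.normal(size=(500, 2))         # proxy samples from a 2-D Gaussian
supersamples = kernel_herding(pool, 20)
print(supersamples.shape)
```

Because the selection is deterministic and explicitly matches the kernel mean, expectations of RKHS functions under the chosen points converge at the O(1/T) rate claimed in the abstract, rather than the O(1/√T) rate of iid sampling.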