Optimally-Weighted Herding is Bayesian Quadrature

08/09/2014
by Ferenc Huszár, et al.

Herding and kernel herding are deterministic methods of choosing samples which summarise a probability distribution. A related task is choosing samples for estimating integrals using Bayesian quadrature. We show that the criterion minimised when selecting samples in kernel herding is equivalent to the posterior variance in Bayesian quadrature. We then show that sequential Bayesian quadrature can be viewed as a weighted version of kernel herding which achieves performance superior to any other weighted herding method. We demonstrate empirically a rate of convergence faster than O(1/N). Our results also imply an upper bound on the empirical error of the Bayesian quadrature estimate.
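To make the connection concrete, here is a minimal numerical sketch of greedy kernel herding followed by Bayesian-quadrature reweighting. All specifics are illustrative assumptions, not taken from the paper: the target is p = N(0, 1), the kernel is an RBF with length-scale ℓ = 1 (so the kernel mean embedding has a closed form), and candidates are drawn from a fixed grid. The quantity `var` below is the BQ posterior variance z₀ − zᵀK⁻¹z, which is the criterion the abstract says kernel herding minimises.

```python
import numpy as np

ell = 1.0  # RBF length-scale (illustrative choice)

def k(a, b):
    # RBF kernel matrix k(x, y) = exp(-(x - y)^2 / (2 ell^2))
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * ell**2))

def kernel_mean(x):
    # z(x) = E_{y ~ N(0,1)}[k(x, y)]; closed form for RBF kernel + Gaussian p
    return np.sqrt(ell**2 / (ell**2 + 1)) * np.exp(-x**2 / (2 * (ell**2 + 1)))

# z0 = double integral of k under p(x) p(y), also closed form here
z0 = np.sqrt(ell**2 / (ell**2 + 2))

cand = np.linspace(-4, 4, 401)  # candidate grid (assumption)
X = []
for n in range(12):
    # Greedy kernel-herding rule: maximise z(x) - (1/(n+1)) * sum_i k(x, x_i)
    score = kernel_mean(cand)
    if X:
        score -= k(cand, np.array(X)).sum(axis=1) / (len(X) + 1)
    X.append(cand[np.argmax(score)])

X = np.array(X)
K = k(X, X) + 1e-10 * np.eye(len(X))  # jitter for numerical stability
z = kernel_mean(X)
w = np.linalg.solve(K, z)   # BQ weights, replacing herding's uniform 1/N
var = z0 - z @ w            # BQ posterior variance (the herding objective)
est = w @ X**2              # BQ estimate of E[x^2] under N(0,1), truth = 1
```

With the optimal weights `w` in place of uniform weights 1/N on the same herding samples, the estimate of E[x²] lands close to the true value 1, and `var` gives the corresponding posterior uncertainty.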

