On the positivity and magnitudes of Bayesian quadrature weights

12/20/2018
by Toni Karvonen, et al.

This article reviews and studies the properties of Bayesian quadrature weights, which strongly affect the stability and robustness of the quadrature rule. Specifically, we investigate the conditions needed to guarantee that the weights are positive, or to bound their magnitudes. First, it is shown that the weights are positive in the univariate case if the design points locally minimise the posterior integral variance and the covariance kernel is totally positive (e.g., Gaussian and Hardy kernels). This suggests that gradient-based optimisation of design points may be effective in constructing stable and robust Bayesian quadrature rules. Second, we show that the magnitudes of the weights admit an upper bound in terms of the fill distance and separation radius if the RKHS of the kernel is a Sobolev space (e.g., Matérn kernels), suggesting that quasi-uniform points should be used. A number of numerical examples demonstrate that significant generalisations and improvements appear to be possible, manifesting the need for further research.
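To make the object of study concrete: the Bayesian quadrature weights are obtained by solving a linear system with the kernel Gram matrix and the kernel mean embedding of the integration measure. The sketch below is illustrative only and not taken from the paper; it assumes a Matérn-1/2 (exponential) kernel, an arbitrarily chosen length-scale, the uniform measure on [0, 1], and quasi-uniform design points.

```python
import numpy as np

# Illustrative Bayesian quadrature weight computation (an assumption-laden
# sketch, not the paper's code).
# Kernel: Matern-1/2 (exponential), k(x, y) = exp(-|x - y| / ell);
# integration measure: uniform (Lebesgue) on [0, 1].
ell = 0.3  # length-scale, an arbitrary choice for illustration

def k(x, y):
    return np.exp(-np.abs(x - y) / ell)

def kernel_mean(xi):
    # Closed form of z(xi) = int_0^1 k(x, xi) dx for the exponential kernel.
    return ell * (2.0 - np.exp(-xi / ell) - np.exp(-(1.0 - xi) / ell))

n = 30
x = np.linspace(0.02, 0.98, n)     # quasi-uniform design points
K = k(x[:, None], x[None, :])      # Gram matrix K_ij = k(x_i, x_j)
z = kernel_mean(x)                 # kernel mean at the design points
w = np.linalg.solve(K, z)          # BQ weights: w = K^{-1} z

# The BQ rule approximates int_0^1 f(x) dx by sum_i w_i f(x_i).
estimate = w @ np.sin(x)
truth = 1.0 - np.cos(1.0)
```

Inspecting `w.min()` and `np.abs(w).max()` for a given design shows directly whether any weight is negative and how large the weights grow; the RKHS of the Matérn-1/2 kernel is the Sobolev space H¹, which is the setting of the magnitude bound discussed in the abstract.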


