Bayesian quadrature for H^1(μ) with Poincaré inequality on a compact interval

07/29/2022
by Olivier Roustant, et al.

Motivated by uncertainty quantification of complex systems, we aim to find quadrature formulas of the form ∫_a^b f(x) dμ(x) ≈ ∑_{i=1}^n w_i f(x_i), where f belongs to H^1(μ). Here, μ belongs to a class of continuous probability distributions on [a, b] ⊂ ℝ and ∑_{i=1}^n w_i δ_{x_i} is a discrete probability distribution on [a, b]. We show that H^1(μ) is a reproducing kernel Hilbert space with a continuous kernel K, which allows us to reformulate the quadrature question as a Bayesian (or kernel) quadrature problem. Although K does not have a simple closed form in general, we establish a correspondence between its spectral decomposition and the one associated with Poincaré inequalities, whose common eigenfunctions form a T-system (Karlin and Studden, 1966). The quadrature problem can then be solved in the finite-dimensional proxy space spanned by the first eigenfunctions. The solution is given by a generalized Gaussian quadrature, which we call the Poincaré quadrature. We derive several results for the Poincaré quadrature weights and the associated worst-case error. When μ is the uniform distribution, the results are explicit: the Poincaré quadrature is equivalent to the midpoint (rectangle) quadrature rule, its nodes coincide with the zeros of an eigenfunction, and the worst-case error scales as (b-a)/(2√3) · n^{-1} for large n. By comparison with known results for H^1(0,1), this shows that the Poincaré quadrature is asymptotically optimal. For a general μ, we provide an efficient numerical procedure based on finite elements and linear programming. Numerical experiments provide useful insights: nodes are nearly evenly spaced, weights are close to the probability density at the nodes, and the worst-case error decreases approximately as n^{-1} for large n.
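To make the kernel-quadrature reformulation concrete, here is a minimal numerical sketch. It does not implement the paper's Poincaré quadrature: it uses the classical closed-form reproducing kernel of H^1(0,1) equipped with the norm ‖f‖² = ∫f² + ∫(f')² (recall that the paper's kernel K for a general H^1(μ) has no simple closed form), and it evaluates the worst-case error of the midpoint rule, which the abstract identifies with the Poincaré quadrature for uniform μ. The names k_h1 and worst_case_error are illustrative, not from the paper.

```python
import numpy as np

def k_h1(x, y):
    """Reproducing kernel of the classical Sobolev space H^1(0,1) with
    norm ||f||^2 = int f^2 + int (f')^2 (Neumann Green's function of -u'' + u = delta).
    NOTE: a stand-in for illustration; the paper's kernel K for H^1(mu)
    has no simple closed form in general."""
    lo, hi = np.minimum(x, y), np.maximum(x, y)
    return np.cosh(lo) * np.cosh(1.0 - hi) / np.sinh(1.0)

def worst_case_error(nodes, weights):
    """Worst-case quadrature error in the RKHS:
    e^2 = int int K dmu dmu - 2 sum_i w_i m(x_i) + w^T K w,
    with mean embedding m(x) = int_0^1 K(x, y) dy.  For this kernel
    m(x) = 1 and int int K = 1, so e^2 = w^T K w - 1 whenever the
    weights sum to one."""
    X, Y = np.meshgrid(nodes, nodes, indexing="ij")
    gram = k_h1(X, Y)
    e2 = weights @ gram @ weights - 2.0 * weights.sum() + 1.0
    return np.sqrt(max(e2, 0.0))

# Midpoint (rectangle) rule on [0,1]: nodes (2i-1)/(2n), equal weights 1/n.
for n in [5, 10, 20, 40, 80]:
    nodes = (2 * np.arange(1, n + 1) - 1) / (2 * n)
    weights = np.full(n, 1.0 / n)
    e = worst_case_error(nodes, weights)
    print(f"n={n:3d}  worst-case error={e:.5f}  n*e={n*e:.4f}")
```

For this kernel the mean embedding ∫_0^1 K(x, y) dy equals 1 for every x, so the squared worst-case error reduces to wᵀKw - 1 for weights summing to one. The printed product n·e should level off to a constant as n grows, consistent with the O(n^{-1}) worst-case rate stated in the abstract.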

Related research

- 04/27/2020: Integration in reproducing kernel Hilbert spaces of Gaussian kernels
- 05/17/2021: Acceleration of the kernel herding algorithm by improved gradient approximation
- 02/22/2021: Kernel quadrature by applying a point-wise gradient descent method to discrete energies
- 08/01/2022: Gauss Quadrature for Freud Weights, Modulation Spaces, and Marcinkiewicz-Zygmund Inequalities
- 07/20/2021: Positively Weighted Kernel Quadrature via Subsampling
- 06/05/2020: Learning from Non-IID Data in Hilbert Spaces: An Optimal Recovery Perspective
- 06/24/2021: Shallow Representation is Deep: Learning Uncertainty-aware and Worst-case Random Feature Dynamics