Equivalence of Convergence Rates of Posterior Distributions and Bayes Estimators for Functions and Nonparametric Functionals

11/27/2020
by Zejian Liu, et al.

We study the posterior contraction rates of a Bayesian method with Gaussian process priors in nonparametric regression and its plug-in property for differential operators. For a general class of kernels, we establish convergence rates of the posterior measure of the regression function and its derivatives, both of which are minimax optimal up to a logarithmic factor for functions in certain classes. Our calculation shows that rate-optimal estimation of the regression function and its derivatives shares the same choice of hyperparameter, indicating that the Bayes procedure remarkably adapts to the order of derivatives and enjoys a generalized plug-in property that extends real-valued functionals to function-valued functionals. This leads to a practically simple method for estimating the regression function and its derivatives, whose finite-sample performance is assessed using simulations. Our proof shows that, under certain conditions, any convergence rate of Bayes estimators corresponds to the same convergence rate of the posterior distributions (i.e., posterior contraction rate), and vice versa. This equivalence holds for a general class of Gaussian processes and covers the regression function and its derivative functionals, under both the L_2 and L_∞ norms. In addition to connecting these two fundamental large-sample properties in Bayesian and non-Bayesian regimes, such equivalence enables a new routine for establishing posterior contraction rates by calculating convergence rates of nonparametric point estimators. At the core of our argument is an operator-theoretic framework for kernel ridge regression and equivalent kernel techniques. We derive a range of sharp non-asymptotic bounds that are pivotal in establishing convergence rates of nonparametric point estimators and the equivalence theory, which may be of independent interest.
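The plug-in property described above admits a concrete illustration: the posterior mean of a Gaussian process regression coincides with a kernel ridge regressor, so an estimate of the derivative f' can be obtained by differentiating the posterior mean, i.e., by differentiating the kernel, while reusing the same hyperparameters chosen for estimating f itself. Below is a minimal NumPy sketch of this idea, assuming a squared-exponential kernel and illustrative values for the lengthscale `ell` and noise level `noise`; these names and settings are placeholders for exposition, not the paper's kernel class or hyperparameter choices.

```python
import numpy as np

def rbf(x, y, ell):
    """Squared-exponential kernel k(x, y) = exp(-(x - y)^2 / (2 ell^2))."""
    return np.exp(-((x[:, None] - y[None, :]) ** 2) / (2.0 * ell ** 2))

def rbf_dx(x, y, ell):
    """Derivative of the kernel in its first argument: d/dx k(x, y)."""
    return -((x[:, None] - y[None, :]) / ell ** 2) * rbf(x, y, ell)

def gp_plugin(x_train, y_train, x_test, ell=0.2, noise=0.1):
    """Posterior mean of f and the plug-in derivative estimate at x_test.

    Both estimates reuse the same weights alpha = (K + noise^2 I)^{-1} y,
    so the derivative costs one extra kernel evaluation, not a new fit.
    """
    K = rbf(x_train, x_train, ell) + noise ** 2 * np.eye(len(x_train))
    alpha = np.linalg.solve(K, y_train)
    f_hat = rbf(x_test, x_train, ell) @ alpha        # posterior mean of f
    df_hat = rbf_dx(x_test, x_train, ell) @ alpha    # derivative of the posterior mean
    return f_hat, df_hat

# Toy check: recover f(x) = sin(2 pi x) and f'(x) = 2 pi cos(2 pi x).
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 200))
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(200)
xs = np.linspace(0.1, 0.9, 5)  # stay away from the boundary
f_hat, df_hat = gp_plugin(x, y, xs)
print(np.column_stack([xs, f_hat, df_hat]))
```

On this toy example the two estimated columns can be compared with sin(2πx) and 2π cos(2πx); the point is that no separate tuning is needed for the derivative, mirroring the adaptation to derivative order claimed in the abstract.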


research · 10/20/2022
Optimal plug-in Gaussian processes for modelling derivatives
Derivatives are a key nonparametric functional in wide-ranging applicati...

research · 06/02/2020
Non-asymptotic Analysis in Kernel Ridge Regression
We develop a general non-asymptotic analysis of learning rates in kernel...

research · 08/27/2021
Convergence Rates for Learning Linear Operators from Noisy Data
We study the Bayesian inverse problem of learning a linear operator on a...

research · 10/24/2018
Posterior Convergence of Gaussian and General Stochastic Process Regression Under Possible Misspecifications
In this article, we investigate posterior convergence in nonparametric r...

research · 10/22/2017
Adaptive Bayesian nonparametric regression using kernel mixture of polynomials with application to partial linear model
We propose a kernel mixture of polynomials prior for Bayesian nonparamet...

research · 05/01/2020
Posterior Convergence of Nonparametric Binary and Poisson Regression Under Possible Misspecifications
In this article, we investigate posterior convergence of nonparametric b...

research · 04/06/2021
Nonparametric needlet estimation for partial derivatives of a probability density function on the d-torus
This paper is concerned with the estimation of the partial derivatives o...
