Optimal plug-in Gaussian processes for modelling derivatives

10/20/2022
by Zejian Liu, et al.

Derivatives are a key nonparametric functional in wide-ranging applications where the rate of change of an unknown function is of interest. In the Bayesian paradigm, Gaussian processes (GPs) are routinely used as a flexible prior for unknown functions and are arguably one of the most popular tools in many areas. However, little is known about the optimal modelling strategy and theoretical properties when using GPs for derivatives. In this article, we study a plug-in strategy that differentiates the posterior distribution under a GP prior to obtain derivatives of any order. This practically appealing plug-in GP method has previously been perceived as suboptimal and degraded, but this is not necessarily the case. We provide posterior contraction rates for plug-in GPs and establish that they remarkably adapt to derivative orders. We show that the posterior measure of the regression function and its derivatives, with the same choice of hyperparameter that does not depend on the order of derivatives, converges at the minimax optimal rate up to a logarithmic factor for functions in certain classes. To the best of our knowledge, this provides the first positive result for plug-in GPs in the context of inferring derivative functionals, and it leads to a practically simple nonparametric Bayesian method with guided hyperparameter tuning for simultaneously estimating the regression function and its derivatives. Simulations show competitive finite-sample performance of the plug-in GP method. A climate change application analyzing global sea-level rise is discussed.
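The plug-in strategy amounts to differentiating the GP posterior for the regression function rather than placing a separate model on the derivative. Below is a minimal numpy sketch of that idea for first derivatives, assuming a squared-exponential kernel and one-dimensional inputs; the helper names and hyperparameter values (ell, sigma2) are illustrative choices, not the paper's notation or its guided hyperparameter tuning.

import numpy as np

def se_kernel(x, z, ell):
    """Squared-exponential kernel k(x, z) = exp(-(x - z)^2 / (2 ell^2))."""
    d = x[:, None] - z[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def se_kernel_dx(x, z, ell):
    """Derivative of the kernel in its first argument, d/dx k(x, z)."""
    d = x[:, None] - z[None, :]
    return -(d / ell**2) * np.exp(-0.5 * (d / ell) ** 2)

def plugin_gp(x_train, y_train, x_test, ell=0.2, sigma2=0.01):
    """Posterior means of f and f' at x_test, obtained by differentiating the GP posterior mean."""
    K = se_kernel(x_train, x_train, ell) + sigma2 * np.eye(len(x_train))
    alpha = np.linalg.solve(K, y_train)                   # (K + sigma^2 I)^{-1} y
    mean_f = se_kernel(x_test, x_train, ell) @ alpha      # plug-in estimate of f
    mean_df = se_kernel_dx(x_test, x_train, ell) @ alpha  # plug-in estimate of f'
    return mean_f, mean_df

# Toy usage: recover sin(2*pi*x) and its derivative 2*pi*cos(2*pi*x) from noisy data.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 100))
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=x.size)
xs = np.linspace(0, 1, 50)
f_hat, df_hat = plugin_gp(x, y, xs)

Higher-order derivatives follow the same pattern by differentiating the kernel in its first argument the corresponding number of times; the same training-data weights alpha are reused across all orders.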


Related research

11/27/2020
Equivalence of Convergence Rates of Posterior Distributions and Bayes Estimators for Functions and Nonparametric Functionals
We study the posterior contraction rates of a Bayesian method with Gauss...

03/31/2017
Exploiting gradients and Hessians in Bayesian optimization and Bayesian quadrature
An exciting branch of machine learning research focuses on methods for l...

08/16/2017
Frequentist coverage and sup-norm convergence rate in Gaussian process regression
Gaussian process (GP) regression is a powerful interpolation technique d...

09/17/2019
Compositional uncertainty in deep Gaussian processes
Gaussian processes (GPs) are nonparametric priors over functions, and fi...

10/26/2021
Non-Gaussian Gaussian Processes for Few-Shot Regression
Gaussian Processes (GPs) have been widely used in machine learning to mo...

03/05/2020
SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for Gaussian Process Regression with Derivatives
Gaussian processes are an important regression tool with excellent analy...

03/08/2019
Active learning for enumerating local minima based on Gaussian process derivatives
We study active learning (AL) based on Gaussian Processes (GPs) for effi...
