Posterior contraction for deep Gaussian process priors

05/16/2021
by Gianluca Finocchio, et al.

We study posterior contraction rates for a class of deep Gaussian process priors applied to the nonparametric regression problem under a general composition assumption on the regression function. It is shown that the contraction rates can achieve the minimax convergence rate (up to log n factors), while being adaptive to the underlying structure and smoothness of the target function. The proposed framework extends Bayesian nonparametric theory for Gaussian process priors.
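
For intuition, a draw from a two-layer deep Gaussian process prior is a composition of GP paths, f = g1(g0(x)). The following minimal Python sketch illustrates such a draw; it is not the paper's construction, and the squared-exponential kernel, length-scales, and grid sizes are illustrative assumptions.

import numpy as np

def se_kernel(x, y, length_scale):
    """Squared-exponential covariance k(x, y) = exp(-|x - y|^2 / (2 l^2))."""
    d = x[:, None] - y[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def sample_gp(x, length_scale, rng):
    """Draw a centered GP path at the points x (jitter added for stability)."""
    K = se_kernel(x, x, length_scale) + 1e-8 * np.eye(len(x))
    return rng.multivariate_normal(np.zeros(len(x)), K)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)

# Inner layer: g0 is a GP path evaluated on the design points.
g0 = sample_gp(x, length_scale=0.3, rng=rng)

# Outer layer: draw g1 on a grid covering the range of g0, then compose
# by interpolation, giving f(x) = g1(g0(x)).
u = np.linspace(g0.min(), g0.max(), 200)
g1 = sample_gp(u, length_scale=0.3, rng=rng)
f = np.interp(g0, u, g1)

Interpolating the outer draw on a grid covering the range of the inner layer is a standard finite-dimensional stand-in for evaluating g1 at the random values g0(x); deeper priors iterate the same composition step.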

Related research

05/16/2022
On the inability of Gaussian process regression to optimally learn compositional functions
We rigorously prove that deep Gaussian process priors can outperform Gau...

12/15/2017
A Theoretical Framework for Bayesian Nonparametric Regression: Orthonormal Random Series and Rates of Contraction
We develop a unifying framework for Bayesian nonparametric regression to...

11/29/2018
Rates of contraction of posterior distributions based on p-exponential priors
We consider a family of infinite dimensional product measures with tails...

12/14/2021
Posterior contraction rates for constrained deep Gaussian processes in density estimation and classification
We provide posterior contraction rates for constrained deep Gaussian pro...

08/16/2017
Frequentist coverage and sup-norm convergence rate in Gaussian process regression
Gaussian process (GP) regression is a powerful interpolation technique d...

01/19/2023
Semiparametric inference using fractional posteriors
We establish a general Bernstein–von Mises theorem for approximately lin...

09/27/2019
Adaptive posterior contraction rates for empirical Bayesian drift estimation of a diffusion
Gaussian process (GP) priors are attractive for estimating the drift of ...
