
Minimax Optimal Regression over Sobolev Spaces via Laplacian Regularization on Neighborhood Graphs

by Alden Green, et al.

In this paper we study the statistical properties of Laplacian smoothing, a graph-based approach to nonparametric regression. Under standard regularity conditions, we establish upper bounds on the error of the Laplacian smoothing estimator f̂, and of a goodness-of-fit test also based on f̂. These upper bounds match the minimax optimal estimation and testing rates of convergence over the first-order Sobolev class H^1(𝒳), for 𝒳 ⊆ ℝ^d and 1 ≤ d < 4; in the estimation problem, for d = 4, they are optimal modulo a log n factor. Additionally, we prove that Laplacian smoothing is manifold-adaptive: if 𝒳 ⊆ ℝ^d is an m-dimensional manifold with m < d, then the error rate of Laplacian smoothing (in either estimation or testing) depends only on m, in the same way it would if 𝒳 were a full-dimensional set in ℝ^m.



