Minimax Optimal Regression over Sobolev Spaces via Laplacian Regularization on Neighborhood Graphs

06/03/2021 ∙ by Alden Green, et al.

In this paper we study the statistical properties of Laplacian smoothing, a graph-based approach to nonparametric regression. Under standard regularity conditions, we establish upper bounds on the error of the Laplacian smoothing estimator f, and of a goodness-of-fit test also based on f. These upper bounds match the minimax optimal estimation and testing rates of convergence over the first-order Sobolev class H^1(𝒳), for 𝒳⊆ℝ^d and 1 ≤ d < 4; in the estimation problem, for d = 4, they are optimal modulo a log n factor. Additionally, we prove that Laplacian smoothing is manifold-adaptive: if 𝒳⊆ℝ^d is an m-dimensional manifold with m < d, then the error rate of Laplacian smoothing (in either estimation or testing) depends only on m, in the same way it would if 𝒳 were a full-dimensional set in ℝ^m.
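To make the estimator concrete, here is a minimal sketch (not the authors' implementation) of Laplacian smoothing: build an ε-neighborhood graph on the design points, form its graph Laplacian L, and solve the penalized least-squares problem f̂ = argmin_f ‖y − f‖² + λ fᵀL f, whose closed form is f̂ = (I + λL)⁻¹ y. The radius `eps` and penalty `lam` below are illustrative choices, not tuning rules from the paper.

```python
import numpy as np

def laplacian_smoother(X, y, eps=0.1, lam=0.5):
    """Laplacian smoothing fit at the design points X.

    Builds an epsilon-neighborhood graph on the rows of X, forms the
    unnormalized graph Laplacian L = D - W, and returns the minimizer
    of ||y - f||^2 + lam * f^T L f, i.e. f_hat = (I + lam*L)^{-1} y.
    """
    n = len(X)
    # Pairwise squared distances; connect points within radius eps.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    W = (d2 <= eps ** 2).astype(float)
    np.fill_diagonal(W, 0.0)                 # no self-loops
    L = np.diag(W.sum(axis=1)) - W           # unnormalized graph Laplacian
    return np.linalg.solve(np.eye(n) + lam * L, y)

# Toy example: noisy samples of a smooth function on [0, 1].
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 1))
y = np.sin(2 * np.pi * X[:, 0]) + 0.3 * rng.standard_normal(200)
f_hat = laplacian_smoother(X, y)
```

Because L has zero row sums, (I + λL)⁻¹ preserves the sample mean of y, and the fit is strictly shrunk toward smoother functions: the penalized solution always has smaller roughness fᵀL f than y itself.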



