An Oracle Property of The Nadaraya-Watson Kernel Estimator for High Dimensional Nonparametric Regression

11/25/2017
by Daniel Conn, et al.

The celebrated Nadaraya-Watson kernel estimator is among the most studied methods for nonparametric regression. A classical result is that its rate of convergence depends on the number of covariates and deteriorates quickly as the dimension grows, which underscores the "curse of dimensionality" and has limited its use in high dimensional settings. In this article, we show that when the true regression function follows a single- or multi-index model, the effects of the curse of dimensionality may be mitigated for the Nadaraya-Watson kernel estimator. Specifically, we prove that with K-fold cross-validation, the Nadaraya-Watson kernel estimator indexed by a positive semidefinite bandwidth matrix has an oracle property: its rate of convergence depends on the number of indices of the regression function rather than the number of covariates. Intuitively, this oracle property is a consequence of allowing the bandwidths to diverge to infinity, as opposed to restricting them all to converge to zero at certain rates as in previous theoretical studies. Our result provides a theoretical perspective for the use of kernel estimation in high dimensional nonparametric regression and in other applications, such as metric learning, where a low rank structure is anticipated. Numerical illustrations are given through simulations and real data examples.
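For concreteness, the snippet below is a minimal sketch, not the authors' implementation: a Nadaraya-Watson estimator whose Gaussian kernel is indexed by a positive semidefinite matrix A (playing the role of an inverse squared bandwidth matrix, so a zero eigenvalue of A corresponds to a bandwidth diverging to infinity), with A selected by K-fold cross-validation over a small illustrative candidate set. The toy data, candidate matrices, and function names are assumptions made for illustration only.

```python
# Sketch: Nadaraya-Watson regression with a PSD matrix A inside a Gaussian
# kernel, A chosen by K-fold cross-validation (illustrative, not the paper's code).
import numpy as np
from sklearn.model_selection import KFold

def nw_predict(X_train, y_train, X_test, A):
    """Nadaraya-Watson fit with Gaussian kernel exp(-0.5 (x - x_i)' A (x - x_i))."""
    diff = X_test[:, None, :] - X_train[None, :, :]      # (n_test, n_train, p)
    d2 = np.einsum('ijk,kl,ijl->ij', diff, A, diff)      # quadratic form per pair
    w = np.exp(-0.5 * d2)                                # kernel weights
    denom = w.sum(axis=1)
    denom[denom == 0] = 1.0                              # guard against empty neighborhoods
    return (w @ y_train) / denom

def select_A_by_cv(X, y, candidates, n_splits=5, seed=0):
    """Return the candidate matrix with the smallest K-fold CV prediction error."""
    kf = KFold(n_splits=n_splits, shuffle=True, random_state=seed)
    cv_err = []
    for A in candidates:
        fold_err = [np.mean((y[te] - nw_predict(X[tr], y[tr], X[te], A)) ** 2)
                    for tr, te in kf.split(X)]
        cv_err.append(np.mean(fold_err))
    return candidates[int(np.argmin(cv_err))]

# Toy single-index example (assumed for illustration): y depends on X only
# through b'x, so a good A can be (near) rank one even when p is moderately large.
rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.normal(size=(n, p))
b = np.zeros(p); b[0] = 1.0
y = np.sin(X @ b) + 0.1 * rng.normal(size=n)

# Candidates: isotropic matrices versus rank-one matrices aligned with the index.
candidates = ([np.eye(p) / h ** 2 for h in (0.5, 1.0, 2.0)]
              + [np.outer(b, b) / h ** 2 for h in (0.2, 0.5, 1.0)])
A_hat = select_A_by_cv(X, y, candidates)
```

In this kind of toy single-index setting, cross-validation tends to favor the low rank candidates, which mirrors the intuition behind the oracle property described above: directions irrelevant to the regression function are effectively smoothed out by very large bandwidths.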


