An Upper Bound of the Bias of Nadaraya-Watson Kernel Regression under Lipschitz Assumptions

01/29/2020
by   Samuele Tosatto, et al.

The Nadaraya-Watson kernel estimator is among the most popular nonparametric regression techniques, thanks to its simplicity. Its asymptotic bias was studied by Rosenblatt in 1969 and has been reported in a substantial body of related literature. However, Rosenblatt's analysis is only valid for infinitesimal bandwidth. In contrast, in this paper we propose an upper bound of the bias that holds for finite bandwidths. Moreover, contrary to the classic analysis, we allow the first derivative of the regression function to be discontinuous, we extend our bounds to multidimensional domains, and, when a bound on the regression function exists and is known, we incorporate it to obtain a tighter result. We believe that this work has potential applications in those fields where hard guarantees on the estimation error are needed.
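For readers unfamiliar with the estimator studied here, a minimal sketch of a Nadaraya-Watson regressor with a Gaussian kernel is shown below. This is an illustration only, not the authors' code: the kernel choice, the bandwidth value h, and the toy data (a function with a kink, i.e. a discontinuous first derivative) are assumptions made for the example.

```python
import numpy as np

def nadaraya_watson(x_query, x_train, y_train, h=0.5):
    """Nadaraya-Watson estimate at x_query using a Gaussian kernel of bandwidth h.

    Illustrative sketch only; the paper's finite-bandwidth bias bound concerns
    estimators of this general form.
    """
    # Kernel weights K((x - x_i) / h) for every (query, training) pair.
    diffs = (x_query[:, None] - x_train[None, :]) / h
    weights = np.exp(-0.5 * diffs ** 2)
    # Locally weighted average of the responses; the denominator normalizes the weights.
    return (weights @ y_train) / weights.sum(axis=1)

# Toy usage: noisy samples of |x|, whose first derivative is discontinuous at 0.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=200)
y = np.abs(x) + 0.05 * rng.standard_normal(200)
x_grid = np.linspace(-1.0, 1.0, 5)
print(nadaraya_watson(x_grid, x, y, h=0.2))
```

The bandwidth h trades off bias and variance: a smaller h reduces smoothing bias near the kink but increases variance, which is why a bias bound valid for finite (non-infinitesimal) h is of practical interest.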


