Monotonicity preservation properties of kernel regression estimators

07/03/2020
by Iosif Pinelis, et al.

Three common classes of kernel regression estimators are considered: the Nadaraya–Watson (NW) estimator, the Priestley–Chao (PC) estimator, and the Gasser–Müller (GM) estimator. It is shown that (i) the GM estimator has a certain monotonicity preservation property for any kernel K, (ii) the NW estimator has this property if and only if the kernel K is log-concave, and (iii) the PC estimator does not have this property for any kernel K. Other related properties of these regression estimators are discussed.
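The three estimators can be sketched numerically. The following is an illustrative sketch (not code from the paper), using a Gaussian kernel, which is log-concave; by the results above, both the NW and GM fits of increasing responses should themselves be increasing. The function names and the equally spaced design are assumptions for this example.

```python
import numpy as np
from math import erf, sqrt

def K(u):
    """Standard Gaussian kernel (log-concave)."""
    return np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)

def Phi(z):
    """Gaussian CDF, used for the GM cell integrals."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def nw(x, xs, ys, h):
    """Nadaraya-Watson: kernel-weighted local average of the ys."""
    w = K((x - xs) / h)
    return np.sum(w * ys) / np.sum(w)

def pc(x, xs, ys, h):
    """Priestley-Chao: spacing-weighted kernel sum (equally spaced design)."""
    dx = xs[1] - xs[0]
    return dx * np.sum(K((x - xs) / h) * ys) / h

def gm(x, xs, ys, h):
    """Gasser-Muller: weight y_i by the kernel mass over its cell [s_{i-1}, s_i]."""
    s = np.concatenate(([-np.inf], (xs[:-1] + xs[1:]) / 2, [np.inf]))
    w = np.array([Phi((x - s[i]) / h) - Phi((x - s[i + 1]) / h)
                  for i in range(len(xs))])
    return np.sum(w * ys)

xs = np.linspace(0.0, 1.0, 20)
ys = xs**2                       # strictly increasing responses
grid = np.linspace(0.0, 1.0, 50)
h = 0.15
nw_fit = np.array([nw(x, xs, ys, h) for x in grid])
gm_fit = np.array([gm(x, xs, ys, h) for x in grid])
print(np.all(np.diff(nw_fit) >= 0))  # NW monotone here (log-concave kernel)
print(np.all(np.diff(gm_fit) >= 0))  # GM monotone for any kernel
```

Note that a single numerical run cannot demonstrate result (iii): the PC estimator's failure to preserve monotonicity means a counterexample design exists for every kernel, not that every fit of monotone data is non-monotone.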


