Non-parametric Quantile Regression via the K-NN Fused Lasso

12/03/2020
by Steven Siwei Ye, et al.

Quantile regression is a statistical method for estimating conditional quantiles of a response variable. Moreover, it is well known that quantile regression is more robust to outliers than l_2-based mean estimation methods. By using the fused lasso penalty over a K-nearest neighbors graph, we propose an adaptive quantile estimator in a non-parametric setup. We show that the estimator attains a convergence rate of n^(-1/d), up to a logarithmic factor, under mild assumptions on the data generation mechanism of the d-dimensional data. We develop algorithms to compute the estimator and discuss methodology for model selection. Numerical experiments on simulated and real data demonstrate clear advantages of the proposed estimator over state-of-the-art methods.
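To make the idea concrete, here is a minimal sketch of the estimator the abstract describes: build a K-nearest-neighbors graph over the covariates, then minimize the pinball (quantile) loss plus a fused lasso penalty on the differences of fitted values across graph edges. This is an illustrative implementation only; the function names (`knn_edges`, `quantile_knn_fused_lasso`) and the plain subgradient-descent solver are assumptions of this sketch, not the specialized algorithms developed in the paper.

```python
import numpy as np

def knn_edges(X, k):
    """Undirected edge list of the K-NN graph over rows of X (brute force)."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)                      # exclude self-neighbors
    nbrs = np.argsort(D, axis=1)[:, :k]              # k nearest per point
    edges = {(min(i, j), max(i, j)) for i in range(len(X)) for j in nbrs[i]}
    return np.array(sorted(edges))

def quantile_knn_fused_lasso(X, y, tau=0.5, k=5, lam=0.1,
                             lr=0.05, n_iter=2000):
    """Minimize pinball loss + lam * sum_{(i,j) in E} |theta_i - theta_j|
    by subgradient descent (a simple stand-in for a dedicated solver)."""
    n = len(y)
    E = knn_edges(X, k)
    theta = np.full(n, np.quantile(y, tau))          # constant warm start
    for t in range(n_iter):
        r = y - theta
        g = -np.where(r > 0, tau, tau - 1.0)         # pinball subgradient
        diff = np.sign(theta[E[:, 0]] - theta[E[:, 1]])
        g_pen = np.zeros(n)                          # fused lasso subgradient
        np.add.at(g_pen, E[:, 0], lam * diff)
        np.add.at(g_pen, E[:, 1], -lam * diff)
        theta = theta - lr / np.sqrt(t + 1.0) * (g + g_pen)
    return theta
```

The penalty shrinks differences between fitted values of neighboring points toward zero, so the estimate adapts to piecewise-smooth conditional quantile functions; the choice of tau selects which conditional quantile is targeted.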


