Nonparametric modal regression

12/04/2014
by Yen-Chi Chen, et al.

Modal regression estimates the local modes of the conditional distribution of Y given X = x, rather than the conditional mean as in usual regression, and can therefore reveal important structure that standard regression methods miss. We study a simple nonparametric method for modal regression, based on a kernel density estimate (KDE) of the joint distribution of Y and X. We derive asymptotic error bounds for this method and propose techniques for constructing confidence sets and prediction sets; the prediction sets are also used to select the smoothing bandwidth of the underlying KDE. The idea behind modal regression is connected to many others, such as mixture regression and density ridge estimation, and we discuss these ties as well.
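As a concrete illustration (not taken from the paper), a KDE-based modal regression estimate can be computed with a "partial mean-shift": for each query point x, candidate y-values are iteratively shifted toward the local modes of the estimated conditional density of Y given X = x. The sketch below uses Gaussian kernels; the bandwidth h, iteration count, and rounding tolerance are illustrative choices, not the paper's bandwidth-selection procedure.

```python
import numpy as np

def modal_regression(X, Y, x_grid, h=0.3, n_steps=50):
    """Partial mean-shift: for each x in x_grid, push candidate y-values
    toward local modes of the KDE-based conditional density of Y given X = x."""
    modes = []
    for x in x_grid:
        wx = np.exp(-0.5 * ((x - X) / h) ** 2)   # Gaussian kernel weights in x
        y = Y.copy()                             # start one candidate at every Y_i
        for _ in range(n_steps):
            # Gaussian kernel weights in y, combined with the fixed x-weights
            wy = np.exp(-0.5 * ((y[:, None] - Y[None, :]) / h) ** 2)
            w = wy * wx[None, :]
            y = (w @ Y) / w.sum(axis=1)          # mean-shift update in y only
        # distinct limits of the iteration approximate the modal set at x
        modes.append(np.unique(np.round(y, 2)))
    return modes

# Toy example with a two-branch response, so the modal set has two curves
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, 400)
Y = np.where(rng.random(400) < 0.5, 1.0, -1.0) + 0.1 * rng.normal(size=400)
modes = modal_regression(X, Y, x_grid=[0.0])
print(modes[0])  # roughly two clusters of values, near -1 and +1
```

Because the x-weights are held fixed while y is updated, this is the conditional (partial) mean-shift rather than ordinary mean-shift on the joint density; a plain conditional-mean regression would instead average the two branches to a value near 0 and miss both modes.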

Related research

01/22/2019
Modal clustering asymptotics with applications to bandwidth selection
Density-based clustering relies on the idea of linking groups to some sp...

06/13/2016
Modal-set estimation with an application to clustering
We present a first procedure that can estimate -- with statistical consi...

02/14/2020
An implicit function learning approach for parametric modal regression
For multi-valued functions—such as when the conditional distribution on ...

02/20/2017
A Statistical Learning Approach to Modal Regression
This paper studies the nonparametric modal regression problem systematic...

04/23/2020
Asymptotic Confidence Regions for Density Ridges
We develop large sample theory including nonparametric confidence region...

10/26/2020
Modal clustering of matrix-variate data
The nonparametric formulation of density-based clustering, known as moda...

11/19/2022
Bayesian Modal Regression based on Mixture Distributions
Compared to mean regression and quantile regression, the literature on m...
