New Insights into Learning with Correntropy Based Regression

06/19/2020
by Yunlong Feng, et al.

Stemming from information-theoretic learning, the correntropy criterion and its applications to machine learning tasks have been extensively explored and studied. Its application to regression problems leads to a robustness-enhanced regression paradigm, namely, correntropy based regression. Having found a great variety of successful real-world applications, its theoretical properties have also been investigated recently in a series of studies from a statistical learning viewpoint. The resulting big picture is that correntropy based regression regresses towards the conditional mode function or the conditional mean function robustly under certain conditions. Continuing this trend and going further, in the present study, we report some new insights into this problem. First, we show that under the additive noise regression model, such a regression paradigm can be deduced from minimum distance estimation, implying that the resulting estimator is essentially a minimum distance estimator and thus possesses robustness properties. Second, we show that the regression paradigm, in fact, provides a unified approach to regression problems in that it approaches the conditional mean, the conditional mode, as well as the conditional median functions under certain conditions. Third, we present some new results when it is utilized to learn the conditional mean function by developing its error bounds and exponential convergence rates under conditional (1+ϵ)-moment assumptions. The saturation effect on the established convergence rates, which was previously observed under (1+ϵ)-moment assumptions, still occurs, indicating the inherent bias of the regression estimator. These novel insights deepen our understanding of correntropy based regression and also enable us to investigate learning schemes induced by other bounded nonconvex loss functions.
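To make the paradigm concrete, the following is a minimal, self-contained sketch of correntropy based regression in the linear case: the estimator minimizes the empirical correntropy-induced (Welsch-type) loss σ²(1 − exp(−r²/(2σ²))) over the residuals r, which down-weights gross outliers. All variable names, the gradient-descent fitting routine, the least-squares initialization, and the choice σ = 5 are illustrative assumptions for this sketch, not details from the paper, which studies the general kernel-based setting.

```python
import numpy as np

def correntropy_loss(residuals, sigma=1.0):
    # Correntropy-induced (Welsch) loss: sigma^2 * (1 - exp(-r^2 / (2 sigma^2))).
    # Bounded in r, hence robust to heavy-tailed noise and outliers.
    return sigma**2 * (1.0 - np.exp(-residuals**2 / (2.0 * sigma**2)))

def fit_correntropy_regression(X, y, sigma=1.0, lr=0.1, n_iter=2000):
    # Illustrative sketch: gradient descent on the empirical correntropy loss
    # for a linear model, initialized at the ordinary least-squares solution.
    n = X.shape[0]
    w = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        r = X @ w - y
        # Gradient of the mean loss: each residual is re-weighted by
        # exp(-r^2 / (2 sigma^2)), so large (outlying) residuals contribute
        # almost nothing.
        grad = (X.T @ (r * np.exp(-r**2 / (2.0 * sigma**2)))) / n
        w -= lr * grad
    return w

# Toy additive-noise data with a fraction of gross outliers (hypothetical).
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = X @ np.array([1.0, 2.0]) + 0.1 * rng.normal(size=200)
y[:10] += 50.0  # contaminate 5% of responses

w_hat = fit_correntropy_regression(X, y, sigma=5.0)
```

Under this symmetric-noise setup the estimate stays close to the true coefficients (1, 2) despite the contamination, whereas plain least squares would be badly biased; this is the robustness property the abstract attributes to the minimum-distance interpretation.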


