Maximum Correntropy Criterion Regression models with tending-to-zero scale parameters

10/25/2021
by Ying Jing, et al.

Maximum correntropy criterion regression (MCCR) models have been well studied within the framework of statistical learning when the scale parameter takes a fixed value or tends to infinity. This paper studies MCCR models with tending-to-zero scale parameters. It is shown that the optimal learning rate of MCCR models is 𝒪(n^-1) in the asymptotic sense as the sample size n goes to infinity. In the finite-sample case, the robustness of MCCR, Huber, and least squares regression models is compared. Applications of these three methods to real data are also presented.
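As background for the comparison above, the correntropy-induced loss is bounded in the residual, which is what gives MCCR its robustness to outliers relative to the unbounded squared loss. The sketch below (an illustration, not the paper's exact procedure) fits a linear MCCR model by a standard half-quadratic / iteratively reweighted least squares scheme with a small fixed scale parameter `sigma`, and contrasts it with ordinary least squares on data containing heavy outliers:

```python
import numpy as np

def mcc_loss(residuals, sigma):
    # Correntropy-induced loss: sigma^2 * (1 - exp(-r^2 / (2 sigma^2))).
    # Bounded above by sigma^2, so a single outlier has limited influence.
    return sigma**2 * (1.0 - np.exp(-residuals**2 / (2.0 * sigma**2)))

def fit_mccr(X, y, sigma, n_iter=100):
    # Half-quadratic optimization: alternate weighted least squares with
    # weights w_i = exp(-r_i^2 / (2 sigma^2)), which downweight outliers.
    w = np.ones(len(y))
    for _ in range(n_iter):
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
        r = y - X @ beta
        w = np.exp(-r**2 / (2.0 * sigma**2))
    return beta

# Synthetic data: true model y = 1 + 2x + noise, with 5% gross outliers.
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(-1.0, 1.0, n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 2.0 * x + 0.1 * rng.normal(size=n)
y[:10] += 10.0  # contaminate a few responses

beta_ls = np.linalg.lstsq(X, y, rcond=None)[0]  # ordinary least squares
beta_mccr = fit_mccr(X, y, sigma=0.5)           # MCCR fit, small fixed sigma
```

With the contaminated responses, the least squares estimate is pulled toward the outliers, while the MCCR weights shrink their influence, so `beta_mccr` stays close to the true coefficients (1, 2). The choice `sigma=0.5` here is illustrative; the paper's asymptotic analysis concerns the regime where the scale parameter tends to zero with n.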

