Information Complexity Criterion for Model Selection in Robust Regression Using A New Robust Penalty Term

12/04/2020
by Esra Pamukçu, et al.

Model selection is the process of finding the best model among a set of candidate models in which the explanatory variables affect the response variable. A model selection criterion typically consists of two parts: the log-likelihood function as the lack-of-fit term and a specified penalty term. In this paper, we derive a new tool for model selection in robust regression. We introduce a new definition of relative entropy based on objective functions. Owing to its analytical simplicity, we use Huber's objective function ρ_H and propose a specified penalty term C_0^ρ_H to derive a new information complexity criterion, RICOMP_C_0^ρ_H, as a robust model selection tool. Additionally, using the properties of C_0^ρ_H, we propose a new value of the tuning parameter, called k_C_0, for Huber's ρ_H. When contamination of the normal distribution is present, RICOMP_C_0^ρ_H selects the true model more often than rival criteria. Monte Carlo simulation studies are carried out to show the utility of both k_C_0 and RICOMP_C_0^ρ_H. A real data example is also given.
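The abstract does not reproduce the form of C_0^ρ_H or of RICOMP itself, but the two-part structure it describes (a lack-of-fit term built from an objective function plus a penalty term) can be sketched with Huber's ρ_H. The snippet below is a minimal illustration only: the names huber_rho and two_part_criterion are hypothetical, the tuning constant 1.345 is the conventional default for Huber's function rather than the paper's proposed k_C_0, and the penalty argument is a placeholder for the paper's C_0^ρ_H-based penalty.

```python
import numpy as np

def huber_rho(u, k=1.345):
    """Huber's objective function rho_H: quadratic for |u| <= k, linear in the tails.
    k = 1.345 is the conventional tuning constant, not the paper's proposed k_C_0."""
    u = np.asarray(u, dtype=float)
    small = np.abs(u) <= k
    out = np.empty_like(u)
    out[small] = 0.5 * u[small] ** 2
    out[~small] = k * np.abs(u[~small]) - 0.5 * k ** 2
    return out

def two_part_criterion(residuals, scale, penalty):
    """Generic two-part model selection score:
    a Huber-based lack-of-fit term plus a penalty term.
    The penalty here is a stand-in for the paper's C_0^rho_H-based term."""
    lack_of_fit = 2.0 * np.sum(huber_rho(np.asarray(residuals) / scale))
    return lack_of_fit + penalty

# Usage sketch: compare two candidate models by their scores (smaller is better).
# residuals_1, residuals_2 would come from robust fits of the candidate models.
residuals_1 = np.array([0.2, -0.5, 1.1, -0.3])
residuals_2 = np.array([0.1, -0.2, 4.0, -0.1])
print(two_part_criterion(residuals_1, scale=1.0, penalty=4.0))
print(two_part_criterion(residuals_2, scale=1.0, penalty=6.0))
```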


