Penalty, Shrinkage, and Preliminary Test Estimators under Full Model Hypothesis

03/24/2015
by   Enayetur Raheem, et al.

This paper considers a multiple regression model and compares, under the full model hypothesis, analytically as well as by simulation, the performance characteristics of some popular penalty estimators, namely ridge regression (RR), LASSO, adaptive LASSO (aLASSO), SCAD, and elastic net (EN), against the least squares estimator (LSE), the restricted estimator (RE), the preliminary test estimator (PTE), the Stein-type estimator (SE), and the positive-rule Stein estimator (PRSE), when the dimension of the parameter space is smaller than the dimension of the sample space. We find that RR uniformly dominates the LSE, RE, PTE, SE, and PRSE, while LASSO, aLASSO, SCAD, and EN uniformly dominate the LSE only. Further, we observe that neither the penalty estimators nor the Stein-type estimators dominate one another.
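
As a brief orientation, the classical competitors mentioned above can be written in generic textbook form; the notation below is a sketch in the style of the standard shrinkage-estimation literature and may differ in detail from the paper's exact definitions. Partition $\beta = (\beta_1^\top, \beta_2^\top)^\top$, let $\hat\beta$ be the LSE, let $\tilde\beta$ be the restricted estimator under $H_0:\beta_2 = 0$, let $\mathcal{L}_n$ be the test statistic for $H_0$, $p_2$ the number of restricted coefficients, and $c_\alpha$ the level-$\alpha$ critical value. Then

\[
\hat\beta^{\mathrm{PT}} = \hat\beta - (\hat\beta - \tilde\beta)\, I(\mathcal{L}_n \le c_\alpha), \qquad
\hat\beta^{\mathrm{S}} = \hat\beta - \frac{p_2 - 2}{\mathcal{L}_n}\,(\hat\beta - \tilde\beta), \qquad
\hat\beta^{\mathrm{S+}} = \tilde\beta + \Bigl(1 - \frac{p_2 - 2}{\mathcal{L}_n}\Bigr)^{+}(\hat\beta - \tilde\beta),
\]

where $(x)^+ = \max(x, 0)$.

The kind of risk comparison reported in the abstract can be reproduced in miniature with a Monte Carlo sketch such as the one below. It is not the paper's simulation design; the data-generating process, tuning parameters (the alpha values), and sample sizes are illustrative assumptions only.

# Minimal simulation sketch: empirical risk (squared estimation error) of the
# LSE, ridge, and LASSO under a sparse true coefficient vector with p < n.
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso

rng = np.random.default_rng(0)
n, p, reps = 100, 20, 500
beta = np.concatenate([np.array([3.0, 1.5, 2.0]), np.zeros(p - 3)])  # sparse truth

risk = {"LSE": 0.0, "Ridge": 0.0, "LASSO": 0.0}
for _ in range(reps):
    X = rng.standard_normal((n, p))
    y = X @ beta + rng.standard_normal(n)
    fits = {
        "LSE": LinearRegression(fit_intercept=False).fit(X, y),
        "Ridge": Ridge(alpha=1.0, fit_intercept=False).fit(X, y),      # illustrative alpha
        "LASSO": Lasso(alpha=0.05, fit_intercept=False).fit(X, y),     # illustrative alpha
    }
    for name, model in fits.items():
        risk[name] += np.sum((model.coef_ - beta) ** 2) / reps

for name, r in risk.items():
    print(f"{name}: empirical risk {r:.3f}")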


