Split regression modeling

12/13/2018
by Anthony Christidis, et al.

In this note we study the benefits of splitting variables into groups for reducing the variance of linear functions of the regression coefficient estimate. We show that splitting combined with shrinkage can yield estimators with smaller mean squared error than popular shrinkage estimators such as the lasso, ridge regression, and the garrote.
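The split-and-shrink idea can be illustrated with a minimal toy sketch (this is an illustration under assumed design choices, not the authors' estimator): fit ridge regression separately on disjoint groups of predictors and combine the submodel coefficients, then compare the estimation error against a single ridge fit on all predictors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: correlated predictors via a shared latent factor, dense coefficients.
n, p = 100, 20
X = rng.normal(size=(n, p)) + 0.8 * rng.normal(size=(n, 1))
beta_true = np.full(p, 0.5)
y = X @ beta_true + rng.normal(size=n)

def ridge(X, y, lam):
    """Closed-form ridge estimate (no intercept, for simplicity)."""
    k = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ y)

# Baseline: one ridge fit using all predictors jointly.
beta_full = ridge(X, y, lam=1.0)

# Split-and-shrink sketch: partition the predictors into two disjoint
# groups, fit ridge within each group, and stitch the coefficients together.
beta_split = np.zeros(p)
for group in np.array_split(np.arange(p), 2):
    beta_split[group] = ridge(X[:, group], y, lam=1.0)

err_full = np.sum((beta_full - beta_true) ** 2)
err_split = np.sum((beta_split - beta_true) ** 2)
print(f"squared estimation error, full ridge:  {err_full:.3f}")
print(f"squared estimation error, split ridge: {err_split:.3f}")
```

The group sizes, penalty level `lam=1.0`, and equal-split partition are all arbitrary choices for the toy; in practice the split and shrinkage amount would be tuned jointly.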

