Predictive Criteria for Prior Selection Using Shrinkage in Linear Models

01/06/2022
by Dean Dustin, et al.

Choosing a shrinkage method can be done by selecting a penalty from a list of pre-specified penalties or by constructing a penalty based on the data. Given a list of penalties for a class of linear models, we provide comparisons based on sample size and the number of non-zero parameters under a predictive stability criterion based on data perturbation. These comparisons yield recommendations for penalty selection in a variety of settings. If the preference is to construct a penalty customized for a given problem, we propose a technique based on genetic algorithms, again using a predictive criterion. We find that, in general, a custom penalty never performs worse than the commonly used penalties, although there are cases where the custom penalty reduces to a recognizable penalty. Since penalty selection is mathematically equivalent to prior selection, our method also constructs priors. The techniques and recommendations we offer are intended for finite sample cases. In this context, we argue that predictive stability under perturbation is one of the few relevant properties that can be invoked when the true model is not known. Nevertheless, we study variable inclusion in simulations and, as part of our shrinkage selection strategy, we include oracle property considerations. In particular, we see that the oracle property typically holds for penalties that satisfy basic regularity conditions and is therefore not restrictive enough to play a direct role in penalty selection. In addition, our real data example includes considerations emerging from model mis-specification.
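The penalty–prior equivalence invoked in the abstract is the standard Bayesian reading of penalized least squares: with Gaussian errors, a penalized estimator is the posterior mode under a prior whose log-density is proportional to the negative penalty,

\hat{\beta}_\lambda = \arg\min_\beta \|y - X\beta\|_2^2 + \lambda P(\beta) = \arg\max_\beta \pi(\beta \mid y), \qquad \pi(\beta) \propto \exp\!\left(-\tfrac{\lambda}{2\sigma^2} P(\beta)\right),

so the ridge penalty corresponds to a Gaussian prior and the LASSO penalty to a Laplace prior; constructing a penalty therefore constructs a prior.

Below is a minimal sketch of a perturbation-based predictive stability comparison between two common penalties. The perturbation scheme, stability measure, and candidate penalties are illustrative assumptions, not the paper's exact criterion.

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

def predictive_instability(model, X, y, n_perturb=50, noise_scale=0.1, seed=0):
    """Average squared change in fitted predictions when the response is perturbed.

    Illustrative stand-in for a perturbation-based predictive stability
    criterion; the paper's exact criterion may differ.
    """
    rng = np.random.default_rng(seed)
    base_pred = model.fit(X, y).predict(X)
    shifts = []
    for _ in range(n_perturb):
        y_pert = y + noise_scale * y.std() * rng.standard_normal(len(y))
        pert_pred = model.fit(X, y_pert).predict(X)
        shifts.append(np.mean((pert_pred - base_pred) ** 2))
    return float(np.mean(shifts))

# Synthetic linear model with few non-zero parameters (sparse truth).
rng = np.random.default_rng(1)
n, p = 100, 20
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]
y = X @ beta + rng.standard_normal(n)

for name, model in [("ridge", Ridge(alpha=1.0)), ("lasso", Lasso(alpha=0.1))]:
    print(name, predictive_instability(model, X, y))
```

In a comparison of this kind, smaller instability under perturbation would favor a penalty; the recommendations in the paper additionally depend on sample size and the number of non-zero parameters.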
