Variable selection and covariance structure identification using loadings

11/29/2022
by Jan O. Bauer, et al.

We introduce sparse principal loading analysis, a new concept that reduces the dimensionality of cross-sectional data and identifies the underlying covariance structure. Sparse principal loading analysis selects a subset of the existing variables for dimensionality reduction, while variables that have only a small distorting effect on the covariance matrix are discarded. To this end, we show how to detect these variables and provide methods to assess their magnitude of distortion. Sparse principal loading analysis is twofold: it can also identify the underlying block-diagonal covariance structure using sparse loadings. This is a new approach in this context, and we provide a criterion to evaluate whether the detected block structure fits the sample. The method uses sparse loadings rather than eigenvectors to decompose the covariance matrix, which can result in a large loss of information if the chosen loadings are too sparse. However, we show that this is not a concern in our new concept because sparseness is controlled by the aforementioned evaluation criterion. Further, we demonstrate the advantages of sparse principal loading analysis for both variable selection and covariance structure detection, and illustrate the performance of the method with simulations and on real datasets. Supplementary material for this article is available online.
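To make the general idea concrete, the minimal sketch below shows how block structure can be read off sparse loadings of a covariance matrix. It is not the authors' method: it uses scikit-learn's SparsePCA as a stand-in for the sparse loadings, and the simulated data, the sparseness parameter `alpha`, and the grouping rule are assumptions for illustration only (in particular, the paper's evaluation criterion for choosing sparseness is not implemented here).

```python
# Illustrative sketch only: uses SparsePCA loadings as a stand-in for the
# sparse loadings described in the paper. Thresholds and parameters below
# are arbitrary choices for demonstration, not the paper's criterion.
import numpy as np
from sklearn.decomposition import SparsePCA

rng = np.random.default_rng(0)

# Simulate data with a block-diagonal covariance: two uncorrelated blocks of variables.
n = 500
cov1 = np.full((3, 3), 0.8) + 0.2 * np.eye(3)   # strongly correlated block of 3 variables
cov2 = np.full((2, 2), 0.7) + 0.3 * np.eye(2)   # correlated block of 2 variables
block1 = rng.multivariate_normal(np.zeros(3), cov1, size=n)
block2 = rng.multivariate_normal(np.zeros(2), cov2, size=n)
X = np.hstack([block1, block2])

# Fit sparse loadings; alpha controls sparseness (a hypothetical choice here).
spca = SparsePCA(n_components=5, alpha=1.0, random_state=0)
spca.fit(X)
loadings = spca.components_   # shape: (n_components, n_variables)

# Read off candidate blocks: variables whose nonzero loadings fall in the
# same components are grouped together.
support = np.abs(loadings) > 1e-8
for k, row in enumerate(support):
    vars_in_component = np.flatnonzero(row)
    if vars_in_component.size:
        print(f"component {k}: variables {vars_in_component}")
```

With a suitable level of sparseness, the nonzero pattern of the loadings separates the two simulated blocks; variables that load only weakly would be candidates for being discarded in the variable-selection step.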

