
An iterative coordinate descent algorithm to compute sparse low-rank approximations

by Cristian Rusu et al.
Politehnica University of Bucharest

In this paper, we describe a new algorithm to build a few sparse principal components from a given data matrix. Our approach does not explicitly create the covariance matrix of the data and can be viewed as an extension of the Kogbetliantz algorithm to build an approximate singular value decomposition for a few principal components. We demonstrate the performance of the proposed algorithm in recovering sparse principal components on various datasets from the literature and in performing dimensionality reduction for classification applications.
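To make the setting concrete, the sketch below shows a generic sparse-PCA baseline: thresholded power iteration with deflation, operating directly on the data matrix so the covariance matrix is never formed explicitly. This is an illustrative sketch only, not the paper's coordinate descent algorithm; the function name, the hard-thresholding rule, and the sparsity parameter `nnz` are assumptions for the example.

```python
import numpy as np

def sparse_pc(X, k=2, nnz=5, iters=100, seed=0):
    """Illustrative thresholded power iteration for sparse PCA.

    Works directly on the data matrix X (n samples x p features),
    applying the p x p covariance only through two matrix-vector
    products, so it is never formed explicitly. This is a simple
    baseline sketch, not the algorithm described in the paper.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    Xc = X - X.mean(axis=0)            # center the data
    R = Xc.copy()                      # residual after deflation
    components = []
    for _ in range(k):
        v = rng.standard_normal(p)
        v /= np.linalg.norm(v)
        for _ in range(iters):
            v = R.T @ (R @ v)          # covariance-vector product via two matvecs
            # hard threshold: keep only the nnz largest-magnitude loadings
            idx = np.argsort(np.abs(v))[:-nnz]
            v[idx] = 0.0
            v /= np.linalg.norm(v)
        components.append(v)
        s = R @ v
        R = R - np.outer(s, v)         # deflate the captured direction
    return np.array(components).T      # p x k sparse loading matrix

V = sparse_pc(np.random.default_rng(1).standard_normal((50, 10)), k=2, nnz=3)
print(V.shape)                         # (10, 2)
```

Each returned loading vector has unit norm and at most `nnz` nonzero entries; the deflation step removes the variance already captured before the next component is computed.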



