
On Projections to Linear Subspaces

09/26/2022
by Erik Thordsen, et al.

The merit of projecting data onto linear subspaces is well known from, e.g., dimension reduction. One key aspect of subspace projections, the maximum preservation of variance (principal component analysis), has been thoroughly researched, and the effect of random linear projections on measures such as intrinsic dimensionality is still an active line of research. In this paper, we investigate the less explored topic of linear projections onto explicit subspaces of varying dimensionality and the expectations of variance that ensue. The result is a new family of bounds for Euclidean distances and inner products. We showcase the quality of these bounds and investigate their intimate relation to intrinsic dimensionality estimation.
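The bound family itself is developed in the full text; as a minimal sketch of the underlying mechanism (not the paper's construction), the numpy example below projects data onto an explicit k-dimensional subspace and checks the elementary fact that orthogonal projection never increases Euclidean distances, which is the starting point for any such family of lower bounds. The dimensions n, d, k and the random choice of subspace are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not the paper's bounds): orthogonal projection onto a
# k-dimensional linear subspace is a contraction, so projected Euclidean
# distances are lower bounds on the original ones. All sizes here are
# illustrative assumptions.
rng = np.random.default_rng(0)
n, d, k = 100, 50, 10

X = rng.normal(size=(n, d))            # n points in R^d

# Orthonormal basis of an explicit k-dimensional subspace via QR.
Q, _ = np.linalg.qr(rng.normal(size=(d, k)))

Y = X @ Q                              # coordinates of the projected points

# Pairwise Euclidean distances before and after projection.
orig = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
proj = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1)

# Projection never increases distances (up to floating-point slack).
assert np.all(proj <= orig + 1e-9)
```

How tight this lower bound is depends on how much variance the subspace captures, which is exactly the quantity the paper's expectations of variance address.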


Related research

06/07/2021 · HoroPCA: Hyperbolic Dimensionality Reduction via Horospherical Projections
This paper studies Principal Component Analysis (PCA) for data lying in ...

07/19/2021 · Van Trees inequality, group equivariance, and estimation of principal subspaces
We establish non-asymptotic lower bounds for the estimation of principal...

02/10/2010 · Intrinsic dimension estimation of data by principal component analysis
Estimating intrinsic dimensionality of data is a classic problem in patt...

09/21/2017 · Lazy stochastic principal component analysis
Stochastic principal component analysis (SPCA) has become a popular dime...

05/25/2021 · Pruned Collapsed Projection-Aggregation Decoding of Reed-Muller Codes
The paper proposes to decode Reed-Muller (RM) codes by projecting onto o...

09/04/2013 · Some Options for L1-Subspace Signal Processing
We describe ways to define and calculate L_1-norm signal subspaces which...