Universality laws for randomized dimension reduction, with applications

11/30/2015
by Samet Oymak, et al.

Dimension reduction is the process of embedding high-dimensional data into a lower dimensional space to facilitate its analysis. In the Euclidean setting, one fundamental technique for dimension reduction is to apply a random linear map to the data. This dimension reduction procedure succeeds when it preserves certain geometric features of the set. The question is how large the embedding dimension must be to ensure that randomized dimension reduction succeeds with high probability. This paper studies a natural family of randomized dimension reduction maps and a large class of data sets. It proves that there is a phase transition in the success probability of the dimension reduction map as the embedding dimension increases. For a given data set, the location of the phase transition is the same for all maps in this family. Furthermore, each map has the same stability properties, as quantified through the restricted minimum singular value. These results can be viewed as new universality laws in high-dimensional stochastic geometry. Universality laws for randomized dimension reduction have many applications in applied mathematics, signal processing, and statistics. They yield design principles for numerical linear algebra algorithms, for compressed sensing measurement ensembles, and for random linear codes. Furthermore, these results have implications for the performance of statistical estimation methods under a large class of random experimental designs.
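The stability notion from the abstract, the restricted minimum singular value, can be probed numerically. The sketch below (an illustrative experiment, not the paper's construction) applies a Gaussian random linear map to a sample of unit-norm sparse vectors, a standard low-complexity set, and tracks the empirical minimum of ‖Πx‖/‖x‖ over the sample. As the embedding dimension m grows past the complexity of the set, this quantity moves away from zero, which is the qualitative behavior behind the phase transition. All names and parameter choices here are our own.

```python
import numpy as np

def restricted_min_singular_value(Pi, X):
    """Empirical minimum of ||Pi x|| / ||x|| over the columns of X.

    A finite-sample proxy for the restricted minimum singular value
    of the map Pi on the set represented by the sample points in X."""
    norms_in = np.linalg.norm(X, axis=0)
    norms_out = np.linalg.norm(Pi @ X, axis=0)
    return float(np.min(norms_out / norms_in))

rng = np.random.default_rng(0)
n, s, num_points = 200, 5, 500  # ambient dimension, sparsity, sample size

# Sample unit-norm s-sparse test vectors (a simple low-complexity set).
X = np.zeros((n, num_points))
for j in range(num_points):
    support = rng.choice(n, size=s, replace=False)
    X[support, j] = rng.standard_normal(s)
X /= np.linalg.norm(X, axis=0)

# As the embedding dimension m grows, the normalized restricted minimum
# singular value rises from near zero toward one.
results = {}
for m in (5, 20, 80):
    Pi = rng.standard_normal((m, n)) / np.sqrt(m)  # Gaussian embedding map
    results[m] = restricted_min_singular_value(Pi, X)
    print(f"m={m}: {results[m]:.3f}")
```

Replacing the Gaussian matrix with another map from the universality class (e.g., a matrix with i.i.d. sign entries) should, per the paper's main result, leave the location of the transition and the stability profile essentially unchanged.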
