Sub-Gaussian Matrices on Sets: Optimal Tail Dependence and Applications

01/28/2020
by Halyun Jeong, et al.

Random linear mappings are widely used in modern signal processing, compressed sensing, and machine learning. These mappings may be used to embed data into a significantly lower dimension while preserving useful information. This is done by approximately preserving the distances between data points, which are assumed to belong to R^n. Thus, the performance of these mappings is usually captured by how close they are to an isometry on the data. Random Gaussian linear mappings have been studied extensively, while the sub-Gaussian setting is not yet fully understood. In the latter case, the performance depends on the sub-Gaussian norm of the rows. In many applications, e.g., compressed sensing, this norm may be large, or even grow with dimension, and thus it is important to characterize this dependence. We study when a sub-Gaussian matrix can become a near isometry on a set, show that the previously best known dependence on the sub-Gaussian norm was sub-optimal, and present the optimal dependence. Our result not only answers a remaining question posed by Liaw, Mehrabian, Plan and Vershynin in 2017, but also generalizes their work. We also develop a new Bernstein-type inequality for sub-exponential random variables, and a new Hanson-Wright inequality for quadratic forms of sub-Gaussian random variables, in both cases improving the bounds in the sub-Gaussian regime under moment constraints. Finally, we illustrate popular applications such as Johnson-Lindenstrauss embeddings, randomized sketches, and blind demodulation, whose theoretical guarantees can be improved by our results in the sub-Gaussian case.
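The near-isometry property described above can be illustrated numerically. The following is a minimal sketch (not code from the paper) of a sub-Gaussian Johnson-Lindenstrauss-style embedding: a random matrix with i.i.d. Rademacher (±1) entries, scaled by 1/√m, applied to a point cloud in R^n, after which we check how well pairwise distances are preserved. The dimensions, point count, and random seed are illustrative choices.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

n, m, N = 1000, 200, 50          # ambient dim, embedding dim, number of points
X = rng.standard_normal((N, n))  # N data points in R^n

# Sub-Gaussian random linear map: Rademacher (+1/-1) entries, scaled by 1/sqrt(m)
# so that the map is an isometry in expectation.
A = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)

Y = X @ A.T  # embedded points in R^m

# Measure how close the map is to an isometry on the data:
# the ratio of embedded to original pairwise distances should concentrate near 1.
ratios = [np.linalg.norm(Y[i] - Y[j]) / np.linalg.norm(X[i] - X[j])
          for i, j in combinations(range(N), 2)]
print(min(ratios), max(ratios))  # both close to 1 when m is large enough
```

With m on the order of log(N)/eps^2, the distortion of every pairwise distance stays within a factor of 1 ± eps with high probability; the sub-Gaussian norm of the rows enters the required m, which is the dependence the paper sharpens.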


