Johnson-Lindenstrauss Embeddings with Kronecker Structure

06/24/2021
by   Stefan Bamberger, et al.

We prove the Johnson-Lindenstrauss property for matrices Φ D_ξ, where Φ has the restricted isometry property and D_ξ is a diagonal matrix containing the entries of a Kronecker product ξ = ξ^(1) ⊗ … ⊗ ξ^(d) of d independent Rademacher vectors. Such embeddings have been proposed in recent works for a number of applications concerning compression of tensor-structured data, including the oblivious sketching procedure by Ahle et al. for approximate tensor computations. For preserving the norms of p points simultaneously, our result requires Φ to have the restricted isometry property for sparsity C(d) (log p)^d. In the case of subsampled Hadamard matrices, this can improve the dependence of the embedding dimension on p to (log p)^d, while the best previously known result required (log p)^{d+1}. That is, for the case d = 2 at the core of the oblivious sketching procedure by Ahle et al., the scaling improves from cubic to quadratic. We provide a counterexample to prove that the scaling established in our result is optimal under mild assumptions.
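The construction Φ D_ξ described above can be sketched in a few lines of NumPy. The following is a minimal illustration, not the authors' implementation: the function names (`rademacher`, `fwht`, `kronecker_jl_embed`) and the specific normalization are choices made here for clarity. It uses a subsampled Hadamard matrix for Φ (applied via a fast Walsh-Hadamard transform) and a diagonal of Kronecker-structured Rademacher signs for D_ξ, assuming the ambient dimension is n = n₀^d with n a power of 2.

```python
import numpy as np

def rademacher(n, rng):
    # Vector of independent ±1 signs with equal probability.
    return rng.choice([-1.0, 1.0], size=n)

def fwht(x):
    # Fast Walsh-Hadamard transform; len(x) must be a power of 2.
    # Returns H @ x for the (unnormalized) Hadamard matrix H with ±1 entries.
    x = np.asarray(x, dtype=float).copy()
    n = len(x)
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            a = x[i:i + h].copy()
            b = x[i + h:i + 2 * h].copy()
            x[i:i + h] = a + b
            x[i + h:i + 2 * h] = a - b
        h *= 2
    return x

def kronecker_jl_embed(x, d, m, rng):
    """Embed x in R^n into R^m via Phi D_xi, where Phi is a row-subsampled
    Hadamard matrix and xi = xi^(1) x ... x xi^(d) is a Kronecker product
    of d independent Rademacher vectors (requires n = n0^d, n a power of 2)."""
    n = len(x)
    n0 = round(n ** (1.0 / d))
    assert n0 ** d == n, "dimension must be a perfect d-th power"
    # Build the Kronecker-structured sign vector xi.
    xi = np.ones(1)
    for _ in range(d):
        xi = np.kron(xi, rademacher(n0, rng))
    # Apply D_xi, then the Hadamard transform, then subsample m rows.
    y = fwht(xi * x)
    rows = rng.choice(n, size=m, replace=False)
    # 1/sqrt(m) scaling makes E ||Phi D_xi x||^2 = ||x||^2.
    return y[rows] / np.sqrt(m)
```

The 1/√m normalization follows from HᵀH = nI: the full transform multiplies the squared norm by n, and uniformly sampling m of n rows scales its expectation by m/n, so the expected squared norm of the output matches that of the input.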


