Almost Optimal Tensor Sketch

09/03/2019
by Thomas D. Ahle, et al.

We construct a matrix M ∈ R^{m × d^c} with just m = O(c λ ε^{-2} polylog(1/(εδ))) rows, which preserves the norm ‖Mx‖_2 = (1 ± ε)‖x‖_2 of all x in any given λ-dimensional subspace of R^{d^c} with probability at least 1 − δ. This matrix can be applied to tensors x^{(1)} ⊗ ⋯ ⊗ x^{(c)} ∈ R^{d^c} in O(c m min{d, m}) time – hence the name "Tensor Sketch". (Here x ⊗ y = vec(xy^T) = [x_1y_1, x_1y_2, …, x_1y_m, x_2y_1, …, x_ny_m] ∈ R^{nm} for x ∈ R^n, y ∈ R^m.) This improves upon earlier Tensor Sketch constructions by Pagh and Pham [TOCT 2013, SIGKDD 2013] and Avron et al. [NIPS 2014], which require m = Ω(3^c λ^2 δ^{-1}) rows for the same guarantees. The factors of λ, ε^{-2}, and log(1/δ) can all be shown to be necessary, making our sketch optimal up to log factors. With another construction we get λ times more rows, m = Õ(c λ^2 ε^{-2} (log(1/δ))^3), but the matrix can be applied to any vector x^{(1)} ⊗ ⋯ ⊗ x^{(c)} ∈ R^{d^c} in just Õ(c(d + m)) time. This matches the application time of the original Tensor Sketch while still improving the exponential dependencies on c and log(1/δ). Technically, we show two main lemmas: (1) for many Johnson–Lindenstrauss (JL) constructions, if Q, Q' ∈ R^{m × d} are independent JL matrices, the element-wise product Qx ∘ Q'y equals M(x ⊗ y) for some M ∈ R^{m × d^2} which is itself a JL matrix; (2) if the M^{(i)} ∈ R^{m × md} are independent JL matrices, then M^{(1)}(x ⊗ M^{(2)}(y ⊗ ⋯)) = M(x ⊗ y ⊗ ⋯) for some M ∈ R^{m × d^c} which is itself a JL matrix. Combining these two results gives an efficient sketch for tensors of any size.
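As a sanity check on Lemma (1), the short NumPy sketch below (our own illustration, not code from the paper) verifies the underlying algebraic identity: for any Q, Q' ∈ R^{m × d}, the element-wise product (Qx) ∘ (Q'y) equals M(x ⊗ y) when the i-th row of M ∈ R^{m × d^2} is the Kronecker product of the i-th rows of Q and Q'. The choice of random sign matrices and the dimensions are assumptions for the demo; the lemma's actual content (that this M is again a JL matrix) is not something this numerical check establishes.

import numpy as np

# Minimal numerical check of the identity behind Lemma (1), assuming random
# sign matrices as the JL construction; dimensions and seed are illustrative.
rng = np.random.default_rng(0)
m, d = 8, 5

# Independent sign matrices, scaled so that E[‖Qx‖_2^2] = ‖x‖_2^2.
Q  = rng.choice([-1.0, 1.0], size=(m, d)) / np.sqrt(m)
Qp = rng.choice([-1.0, 1.0], size=(m, d)) / np.sqrt(m)

x = rng.standard_normal(d)
y = rng.standard_normal(d)

# Left-hand side: element-wise product of the two independent sketches.
lhs = (Q @ x) * (Qp @ y)

# Right-hand side: M(x ⊗ y), where row i of M is Q[i] ⊗ Qp[i].
M = np.stack([np.kron(Q[i], Qp[i]) for i in range(m)])  # shape (m, d*d)
rhs = M @ np.kron(x, y)                                 # x ⊗ y ∈ R^{d^2}

assert np.allclose(lhs, rhs)
print("max deviation:", np.abs(lhs - rhs).max())

Applying the same identity recursively, in the spirit of Lemma (2), is what lets the sketch be applied to x^{(1)} ⊗ ⋯ ⊗ x^{(c)} without ever materializing the d^c-dimensional vector.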
