Lower Memory Oblivious (Tensor) Subspace Embeddings with Fewer Random Bits: Modewise Methods for Least Squares

12/17/2019
by M. A. Iwen, et al.

In this paper new general modewise Johnson-Lindenstrauss (JL) subspace embeddings are proposed that are both considerably faster to generate and easier to store than traditional JL embeddings when working with extremely large vectors and/or tensors. Corresponding embedding results are then proven for two different types of low-dimensional (tensor) subspaces. The first of these new subspace embedding results produces improved space complexity bounds for embeddings of rank-r tensors whose CP decompositions are contained in the span of a fixed (but unknown) set of r rank-one basis tensors. In the traditional vector setting this first result yields new and very general near-optimal oblivious subspace embedding constructions that require fewer random bits to generate than standard JL embeddings when embedding subspaces of C^N spanned by basis vectors with special Kronecker structure. The second result proven herein provides new fast JL embeddings of arbitrary r-dimensional subspaces S⊂C^N which also require fewer random bits (and so are easier to store - i.e., require less space) than standard fast JL embedding methods in order to achieve small ϵ-distortions. These new oblivious subspace embedding results work by (i) effectively folding any given vector in S into a (not necessarily low-rank) tensor, and then (ii) embedding the resulting tensor into C^m for m ≤ C r log^c(N) / ϵ^2. Applications related to compression and fast compressed least squares solution methods are also considered, including those used for fitting low-rank CP decompositions, and the proposed JL embedding results are shown to work well numerically in both settings.
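The core idea described above, folding a long vector into a tensor and then compressing each mode with a small, independently generated JL matrix, can be illustrated with a minimal NumPy sketch. This is not the paper's construction (which uses structured fast JL matrices and proves distortion bounds); it is a toy Gaussian version, with all dimensions chosen for illustration, that shows why the modewise approach needs far fewer random entries to store than one dense JL matrix of the same overall embedding dimension.

```python
import numpy as np

rng = np.random.default_rng(0)

def modewise_jl(x, shape, target_dims):
    """Fold x into a tensor of the given shape, then compress each mode
    with an independent Gaussian JL matrix (illustrative stand-in for the
    structured modewise embeddings analyzed in the paper)."""
    T = x.reshape(shape)
    for mode, m in enumerate(target_dims):
        n = T.shape[mode]
        A = rng.standard_normal((m, n)) / np.sqrt(m)  # small JL matrix for this mode
        T = np.tensordot(A, T, axes=([1], [mode]))
        # tensordot places the contracted axis first; move it back to `mode`
        T = np.moveaxis(T, 0, mode)
    return T.ravel()

# Embed a length-4096 vector viewed as a 16 x 16 x 16 tensor into C^64
# (real-valued here for simplicity): three 4 x 16 matrices store only
# 3 * 64 = 192 random entries, versus 64 * 4096 for a dense JL matrix.
x = rng.standard_normal(16 * 16 * 16)
y = modewise_jl(x, (16, 16, 16), (4, 4, 4))
print(y.shape)  # (64,)
```

The storage comparison in the final comment is the vector-setting analogue of the paper's point: the modewise factors require far fewer random bits to generate and store than a single unstructured embedding of the full vector.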


research

04/02/2021 · Fast and Accurate Randomized Algorithms for Low-rank Tensor Decompositions
Low-rank Tucker and CP tensor decompositions are powerful tools in data ...

09/29/2017 · Fast online low-rank tensor subspace tracking by CP decomposition using recursive least squares from incomplete observations
We consider the problem of online subspace tracking of a partially obser...

06/30/2020 · Practical Leverage-Based Sampling for Low-Rank Tensor Decomposition
Conventional algorithms for finding low-rank canonical polyadic (CP) ten...

10/05/2020 · Subspace Embeddings Under Nonlinear Transformations
We consider low-distortion embeddings for subspaces under entrywise nonl...

09/11/2019 · Faster Johnson-Lindenstrauss Transforms via Kronecker Products
The Kronecker product is an important matrix operation with a wide range...

09/20/2017 · Near Optimal Sketching of Low-Rank Tensor Regression
We study the least squares regression problem min_{Θ∈S_{D,R}} ‖AΘ − b‖_2, where S_...

06/18/2017 · Sample, computation vs storage tradeoffs for classification using tensor subspace models
In this paper, we exhibit the tradeoffs between the (training) sample, c...
