Near Optimal Sketching of Low-Rank Tensor Regression

09/20/2017
by   Jarvis Haupt, et al.
We study the least squares regression problem min_{Θ ∈ S_{D,R}} ‖AΘ − b‖_2, where S_{D,R} is the set of Θ for which Θ = ∑_{r=1}^{R} θ_1^{(r)} ∘ ⋯ ∘ θ_D^{(r)} for vectors θ_d^{(r)} ∈ ℝ^{p_d} for all r ∈ [R] and d ∈ [D], and ∘ denotes the outer product of vectors. That is, Θ is a low-dimensional, low-rank tensor. This is motivated by the fact that the number of parameters in Θ is only R · ∑_{d=1}^{D} p_d, which is significantly smaller than the ∏_{d=1}^{D} p_d parameters in ordinary least squares regression. We consider the above CP decomposition model of tensors Θ, as well as the Tucker decomposition. For both models we show how to apply data dimensionality reduction techniques based on sparse random projections Φ ∈ ℝ^{m×n}, with m ≪ n, to reduce the problem to a much smaller problem min_Θ ‖ΦAΘ − Φb‖_2, for which, if Θ′ is a near optimum to the smaller problem, then it is also a near optimum to the original problem. We obtain significantly smaller dimension and sparsity in Φ than is possible for ordinary least squares regression, and we also provide a number of numerical simulations supporting our theory.
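To make the reduction concrete, here is a minimal numerical sketch of the general idea (not the paper's algorithm or its guarantees): a CountSketch-style sparse projection Φ with one random ±1 entry per column is applied to both A and b, and the much smaller sketched least squares problem is solved instead of the original. All problem sizes below are illustrative assumptions, and for simplicity the low-rank constraint on Θ is dropped (the solve is over all of ℝ^N), which is enough to show the dimension reduction at work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (hypothetical) sizes: n observations, N = p_1 * p_2 parameters.
n, p1, p2 = 2000, 10, 10
N = p1 * p2

# Synthetic data whose true parameter tensor is rank 1 (R = 1, D = 2).
A = rng.standard_normal((n, N))
theta1, theta2 = rng.standard_normal(p1), rng.standard_normal(p2)
Theta = np.outer(theta1, theta2)
b = A @ Theta.ravel() + 0.01 * rng.standard_normal(n)

# Sparse sketch Phi in R^{m x n}: CountSketch-style, exactly one
# random +/-1 nonzero per column, so Phi @ A costs O(nnz(A)) to form.
m = 500
rows = rng.integers(0, m, size=n)
signs = rng.choice([-1.0, 1.0], size=n)
Phi = np.zeros((m, n))
Phi[rows, np.arange(n)] = signs

# Solve the sketched problem min_theta ||Phi A theta - Phi b||_2
# and compare its residual on the ORIGINAL problem to the true optimum.
theta_sketch, *_ = np.linalg.lstsq(Phi @ A, Phi @ b, rcond=None)
theta_full, *_ = np.linalg.lstsq(A, b, rcond=None)

res_sketch = np.linalg.norm(A @ theta_sketch - b)
res_full = np.linalg.norm(A @ theta_full - b)
print(res_sketch / res_full)  # near-optimality ratio of the sketched solution
```

The sketched system has m = 500 rows instead of n = 2000, yet the residual ratio printed at the end stays close to 1 on typical draws, which is the behavior the subspace-embedding analysis formalizes.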


