Sparse and Low-rank Tensor Estimation via Cubic Sketchings

01/29/2018
by Botao Hao, et al.

In this paper, we propose a general framework for sparse and low-rank tensor estimation from cubic sketchings. A two-stage non-convex implementation is developed based on sparse tensor decomposition and thresholded gradient descent, which ensures exact recovery in the noiseless case and stable recovery in the noisy case with high probability. The non-asymptotic analysis sheds light on an interplay between optimization error and statistical error. The proposed procedure is shown to be rate-optimal under certain conditions. As a technical by-product, novel high-order concentration inequalities are derived for studying high-moment sub-Gaussian tensors. An interesting tensor formulation illustrates the potential application to high-order interaction pursuit in high-dimensional linear regression.
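To make the measurement model and the two-stage procedure concrete, below is a minimal Python/NumPy sketch of a simplified rank-one instance: the target tensor is beta ⊗ beta ⊗ beta with a sparse factor beta, each cubic sketching is y_i = <a_i, beta>^3 + noise with Gaussian sketching vectors a_i, and recovery uses a moment-based initialization followed by hard-thresholded gradient descent. The dimensions, step size, and the particular initializer are illustrative assumptions, not the paper's exact algorithm or tuning.

import numpy as np

rng = np.random.default_rng(0)

# Problem sizes: dimension p, number of cubic sketchings n, sparsity level s.
p, n, s = 100, 2000, 5
beta = np.zeros(p)
beta[:s] = rng.normal(size=s)            # sparse ground-truth factor (T = beta x beta x beta)

A = rng.normal(size=(n, p))              # sub-Gaussian (here Gaussian) sketching vectors a_i
y = (A @ beta) ** 3 + 0.1 * rng.normal(size=n)   # noisy cubic sketchings y_i = <a_i, beta>^3 + eps_i

def hard_threshold(v, k):
    """Keep the k largest-magnitude entries of v and zero out the rest."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

# Stage 1 (initialization): for Gaussian a, E[y a] = 3 ||beta||^2 beta, so the
# thresholded sample average (1/(3n)) sum_i y_i a_i estimates ||beta||^2 beta;
# rescaling by its norm^(2/3) recovers an initial estimate of beta.
v = hard_threshold(A.T @ y / (3.0 * n), s)
b = v / (np.linalg.norm(v) ** (2.0 / 3.0) + 1e-12)

# Stage 2 (thresholded gradient descent) on the squared loss
#   L(b) = 1/(2n) sum_i (y_i - <a_i, b>^3)^2.
for _ in range(200):
    r = y - (A @ b) ** 3                                  # residuals
    grad = -(3.0 / n) * A.T @ (r * (A @ b) ** 2)          # gradient of L at b
    step = 0.005 / (np.linalg.norm(b) ** 4 + 1e-12)       # hand-tuned step size for this sketch
    b = hard_threshold(b - step * grad, s)                # gradient step + hard thresholding

print("relative error:", np.linalg.norm(b - beta) / np.linalg.norm(beta))

The paper's general procedure covers higher-rank tensors, with the initialization obtained via sparse tensor decomposition rather than the simple moment average used above; this sketch is only meant to illustrate the cubic measurement model and the thresholded-gradient iteration.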

Related research

10/12/2022
cuFasterTucker: A Stochastic Optimization Strategy for Parallel Sparse FastTucker Decomposition on GPU Platform
Currently, the size of scientific data is growing at an unprecedented ra...

06/12/2019
Optimal low rank tensor recovery
We investigate the sample size requirement for exact recovery of a high ...

03/16/2021
Generalized Low-rank plus Sparse Tensor Estimation by Fast Riemannian Optimization
We investigate a generalized framework to estimate a latent low-rank plu...

10/10/2018
Learning Tensor Latent Features
We study the problem of learning latent feature models (LFMs) for tensor...

11/09/2019
Tensor Regression Using Low-rank and Sparse Tucker Decompositions
This paper studies a tensor-structured linear regression model with a sc...

08/09/2019
Multivariate Convolutional Sparse Coding with Low Rank Tensor
This paper introduces a new multivariate convolutional sparse coding bas...