
Sparse and Low-rank Tensor Estimation via Cubic Sketchings

by Botao Hao, et al.

In this paper, we propose a general framework for sparse and low-rank tensor estimation from cubic sketchings. A two-stage non-convex procedure is developed, combining sparse tensor decomposition with thresholded gradient descent, which ensures exact recovery in the noiseless case and stable recovery in the noisy case with high probability. The non-asymptotic analysis sheds light on the interplay between optimization error and statistical error. The proposed procedure is shown to be rate-optimal under certain conditions. As a technical by-product, novel high-order concentration inequalities are derived for studying high-moment sub-Gaussian tensors. A tensor formulation further illustrates a potential application to high-order interaction pursuit in high-dimensional linear regression.
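To make the setting concrete, the following is a minimal sketch of the cubic-sketching model and a two-stage estimator in the spirit described above, for the simplest symmetric rank-1 case. All parameter values (dimension `p`, sample size `n`, sparsity `s`, step size `eta`) are illustrative choices, and the moment-based initialization and plain thresholded gradient descent below are a simplified stand-in for the authors' procedure, not their exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, s = 30, 2000, 5  # tensor dimension, number of sketches, sparsity level

# Ground truth: a sparse, symmetric rank-1 tensor T = beta (outer) beta (outer) beta.
beta = np.zeros(p)
support = rng.choice(p, size=s, replace=False)
beta[support] = rng.normal(size=s)
beta /= np.linalg.norm(beta)

# Cubic sketchings: y_i = <T, a_i (outer) a_i (outer) a_i> = (a_i . beta)**3
# with Gaussian sketching vectors a_i ~ N(0, I_p); noiseless case here.
A = rng.normal(size=(n, p))
y = (A @ beta) ** 3

def hard_threshold(v, k):
    """Zero out all but the k largest-magnitude entries of v."""
    out = np.zeros_like(v)
    keep = np.argsort(np.abs(v))[-k:]
    out[keep] = v[keep]
    return out

# Stage 1 (initialization): for Gaussian a, E[(a . beta)**3 * a] = 3 * beta,
# so a thresholded empirical moment gives a point close to beta.
b = hard_threshold(A.T @ y / (3 * n), s)

# Stage 2: thresholded gradient descent on the nonconvex squared loss
#   f(b) = (1/2n) * sum_i (y_i - (a_i . b)**3)**2.
eta = 0.002  # illustrative step size, small enough for stability here
for _ in range(500):
    z = A @ b
    grad = -3.0 / n * A.T @ ((y - z ** 3) * z ** 2)
    b = hard_threshold(b - eta * grad, s)

# In the noiseless case the estimate aligns with the true factor.
print(abs(b @ beta) / np.linalg.norm(b))
```

In this noiseless toy run the cosine similarity between the estimate and the true factor is essentially 1, matching the exact-recovery behavior the abstract describes; with additive noise on `y`, one would instead expect the error to settle at the statistical-error floor.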



