Higher order Matching Pursuit for Low Rank Tensor Learning

03/07/2015
by Yuning Yang, et al.

Low rank tensor learning, such as tensor completion and multilinear multitask learning, has received much attention in recent years. In this paper, we propose higher order matching pursuit for low rank tensor learning problems with a convex or a nonconvex cost function, generalizing matching pursuit type methods. At each iteration, the main cost of the proposed methods is computing a rank-one tensor, which can be done efficiently and makes the methods scalable to large-scale problems. Moreover, the resulting rank-one tensors require little storage, which helps to break the curse of dimensionality. The linear convergence rate of the proposed methods is established in various circumstances. Along with the main methods, we also provide a procedure of low computational complexity for approximately computing the rank-one tensors, with a provable approximation ratio, which improves the efficiency of the main methods and underpins the convergence analysis. Experimental results on synthetic as well as real datasets verify the efficiency and effectiveness of the proposed methods.
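To make the per-iteration structure described above concrete, below is a minimal NumPy sketch of a matching-pursuit-style greedy rank-one scheme for third-order tensor completion under a squared loss. It is an illustrative assumption rather than the authors' algorithm: the rank-one subproblem is solved here by plain alternating power iterations instead of the approximate procedure with a provable ratio proposed in the paper, and the function names (`rank_one_power_iteration`, `greedy_rank_one_completion`) are hypothetical.

```python
# Minimal sketch (assumed, not the paper's code): greedy rank-one updates
# for third-order tensor completion with a squared loss on observed entries.
import numpy as np


def rank_one_power_iteration(G, iters=30, seed=0):
    """Approximate the best rank-one fit u x v x w to a 3-way tensor G
    by alternating power iterations (rank-one ALS)."""
    rng = np.random.default_rng(seed)
    d1, d2, d3 = G.shape
    u = rng.standard_normal(d1); u /= np.linalg.norm(u)
    v = rng.standard_normal(d2); v /= np.linalg.norm(v)
    w = rng.standard_normal(d3); w /= np.linalg.norm(w)
    for _ in range(iters):
        u = np.einsum('ijk,j,k->i', G, v, w); u /= np.linalg.norm(u)
        v = np.einsum('ijk,i,k->j', G, u, w); v /= np.linalg.norm(v)
        w = np.einsum('ijk,i,j->k', G, u, v); w /= np.linalg.norm(w)
    return np.einsum('i,j,k->ijk', u, v, w)


def greedy_rank_one_completion(T_obs, mask, n_terms=20):
    """Build an estimate as a sum of rank-one tensors, adding one term per
    iteration (matching-pursuit style) with an exact line search."""
    X = np.zeros_like(T_obs)
    for _ in range(n_terms):
        grad = mask * (X - T_obs)             # gradient of 0.5*||mask*(X - T)||^2
        R = rank_one_power_iteration(-grad)   # rank-one descent direction
        denom = np.sum((mask * R) ** 2)
        if denom < 1e-12:
            break
        alpha = -np.sum(grad * R) / denom     # exact step size for the squared loss
        X = X + alpha * R
    return X


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic rank-3 ground truth with 40% of entries observed.
    A = rng.standard_normal((10, 3))
    B = rng.standard_normal((12, 3))
    C = rng.standard_normal((14, 3))
    T = np.einsum('ir,jr,kr->ijk', A, B, C)
    mask = (rng.random(T.shape) < 0.4).astype(float)
    X = greedy_rank_one_completion(mask * T, mask, n_terms=30)
    print(f"relative recovery error: {np.linalg.norm(X - T) / np.linalg.norm(T):.3f}")
```

Note that the sketch keeps a dense estimate X for simplicity; in a large-scale setting one would store only the factor vectors and step size of each rank-one term, which is the low storage requirement the abstract refers to.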

