A Solution for Large Scale Nonlinear Regression with High Rank and Degree at Constant Memory Complexity via Latent Tensor Reconstruction

by Sandor Szedmak, et al.

This paper proposes a novel method for learning highly nonlinear, multivariate functions from examples. Our method exploits the property that continuous functions can be approximated by polynomials, which in turn are representable by tensors. The function learning problem is thus transformed into a tensor reconstruction problem, the inverse of tensor decomposition. Our method incrementally builds up the unknown tensor from rank-one terms, which lets us control the complexity of the learned model and reduce the risk of overfitting. For learning the models, we present an efficient gradient-based algorithm that runs in time linear in the sample size, the order and rank of the tensor, and the dimension of the input. In addition to regression, we present extensions to classification, multi-view learning, and vector-valued output, as well as a multi-layered formulation. The method can work in an online fashion by processing mini-batches of the data with constant memory complexity. Consequently, it fits systems equipped with only limited resources, such as embedded systems or mobile phones. Our experiments demonstrate favorable accuracy and running time compared to competing methods.
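The core idea can be sketched in NumPy: a degree-M polynomial is represented by a rank-R tensor built from rank-one terms, giving the model f(x) = Σ_r λ_r Π_m ⟨p_{r,m}, x⟩, and the factors are fitted by mini-batch gradient descent so that only one mini-batch is held in memory at a time. This is a simplified illustration under assumed hyperparameters (R, M, learning rate, toy data), not the authors' full algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = x1*x2 + 0.5*x3^2, a degree-2 polynomial in d = 3 variables.
d, n = 3, 2000
X = rng.normal(size=(n, d))
y = X[:, 0] * X[:, 1] + 0.5 * X[:, 2] ** 2

R, M = 4, 2                      # tensor rank and order (= polynomial degree)
lr, epochs, batch = 0.05, 300, 64

P = rng.normal(scale=0.3, size=(R, M, d))   # factor vectors p_{r,m}
lam = rng.normal(scale=0.3, size=R)         # weights of the rank-one terms

def predict(X, P, lam):
    """f(x) = sum_r lam_r * prod_m <p_{r,m}, x>."""
    Z = np.einsum('rmd,nd->nrm', P, X)      # all inner products <p_{r,m}, x_n>
    return Z.prod(axis=2) @ lam

mse0 = np.mean((predict(X, P, lam) - y) ** 2)

for epoch in range(epochs):
    idx = rng.permutation(n)
    for start in range(0, n, batch):        # constant memory: one mini-batch at a time
        sel = idx[start:start + batch]
        Xb, yb = X[sel], y[sel]
        Z = np.einsum('rmd,nd->nrm', P, Xb)
        prod = Z.prod(axis=2)               # (batch, R): value of each rank-one term
        err = prod @ lam - yb               # residuals on the mini-batch
        # Gradients of 0.5 * mean(err^2):
        g_lam = prod.T @ err / len(sel)
        g_P = np.empty_like(P)
        for m in range(M):
            # leave-one-out product over the other modes
            others = np.delete(Z, m, axis=2).prod(axis=2)
            g_P[:, m, :] = np.einsum('n,r,nr,nd->rd', err, lam, others, Xb) / len(sel)
        lam -= lr * g_lam
        P -= lr * g_P

mse1 = np.mean((predict(X, P, lam) - y) ** 2)
print(f"MSE before: {mse0:.4f}  after: {mse1:.4f}")
```

The rank-one structure is what keeps the cost linear: each mini-batch step touches only the R·M·d factor parameters, never the full d^M coefficient tensor.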



