Beyond Lazy Training for Over-parameterized Tensor Decomposition

10/22/2020
by Xiang Wang, et al.

Over-parametrization is an important technique in training neural networks. In both theory and practice, training a larger network allows the optimization algorithm to avoid bad local optima. In this paper we study a closely related tensor decomposition problem: given an l-th order tensor in (ℝ^d)^{⊗l} of rank r (where r ≪ d), can variants of gradient descent find a rank-m decomposition where m > r? We show that in a lazy training regime (similar to the NTK regime for neural networks) one needs at least m = Ω(d^{l-1}) components, while a variant of gradient descent can find an approximate decomposition with only m = O^*(r^{2.5l} log d) components. Our results show that gradient descent on an over-parametrized objective can go beyond the lazy training regime and utilize certain low-rank structure in the data.
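To make the setup concrete, here is a minimal NumPy sketch of the over-parameterized problem for l = 3: plain gradient descent on a symmetric rank-m CP objective fit to a ground-truth tensor of rank r < m. This is an illustration only, not the paper's algorithm or its analysis; the dimensions, step size, initialization scale, and squared-error loss are all assumptions made for the example.

```python
# Minimal sketch (not the paper's method): gradient descent on an
# over-parameterized symmetric CP decomposition of a 3rd-order tensor.
# d, r, m, the learning rate, and the loss are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
d, r, m = 20, 2, 10      # ambient dimension, true rank, fitted rank (m > r)

# Ground-truth rank-r symmetric tensor: T = sum_i a_i^{⊗3}
A = rng.standard_normal((r, d))
T = np.einsum('ia,ib,ic->abc', A, A, A)

# Over-parameterized components, small random initialization
W = 0.1 * rng.standard_normal((m, d))

lr = 0.01
for step in range(5000):
    T_hat = np.einsum('ia,ib,ic->abc', W, W, W)
    R = T_hat - T                                  # residual tensor
    # Gradient of 0.5 * ||T_hat - T||_F^2 in the symmetric parametrization
    grad = 3.0 * np.einsum('abc,ib,ic->ia', R, W, W)
    W -= lr * grad
    if step % 1000 == 0:
        print(step, np.linalg.norm(R) / np.linalg.norm(T))
```

With m = r this nonconvex objective is prone to bad local optima; the paper's point is that over-parameterizing (m > r, but far below the Ω(d^{l-1}) scale required in the lazy regime) lets gradient-descent-style dynamics recover the low-rank structure.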


Related research

02/01/2023: Experimental observation on a low-rank tensor model for eigenvalue problems
Here we utilize a low-rank tensor model (LTM) as a function approximator...

06/11/2021: Understanding Deflation Process in Over-parametrized Tensor Decomposition
In this paper we study the training dynamics for gradient flow on over-p...

06/28/2015: Beating the Perils of Non-Convexity: Guaranteed Training of Neural Networks using Tensor Methods
Training neural networks is a challenging non-convex optimization proble...

01/16/2013: Big Neural Networks Waste Capacity
This article exposes the failure of some big neural networks to leverage...

08/22/2023: Low Tensor Rank Learning of Neural Dynamics
Learning relies on coordinated synaptic changes in recurrently connected...

01/03/2018: Gradient-based Optimization for Regression in the Functional Tensor-Train Format
We consider the task of low-multilinear-rank functional regression, i.e....

02/24/2017: Strongly-Typed Agents are Guaranteed to Interact Safely
As artificial agents proliferate, it is becoming increasingly important ...
