Gradient Descent for Deep Matrix Factorization: Dynamics and Implicit Bias towards Low Rank

11/27/2020
by Hung-Hsu Chou, et al.

We provide an explicit analysis of the dynamics of vanilla gradient descent for deep matrix factorization in a setting where the minimizer of the loss function is unique. We show that the recovery rate of ground-truth eigenvectors is proportional to the magnitude of the corresponding eigenvalues, and that the differences among these rates are amplified as the depth of the factorization increases. For exactly characterized time intervals, the effective rank of the gradient descent iterates is provably close to the effective rank of a low-rank projection of the ground-truth matrix, so that early stopping of gradient descent produces regularized solutions that may be used, for instance, for denoising. In particular, apart from a few initial iterations, the effective rank of the iterates is monotonically increasing, suggesting that "matrix factorization implicitly forces gradient descent to take a route along which the effective rank is monotone". Since empirical observations in more general scenarios such as matrix sensing show a similar phenomenon, we believe that our theoretical results shed some light on the still mysterious "implicit bias" of gradient descent in deep learning.

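The following minimal sketch is not the paper's experimental setup, but it illustrates the setting the abstract describes: vanilla gradient descent on a depth-3 factorization W = W3 W2 W1 of a noisy rank-2 ground-truth matrix, with the effective rank of the end-to-end product tracked along the way. The dimensions, initialization scale, step size, and the particular effective-rank measure (the exponential of the entropy of the normalized singular values) are illustrative assumptions.

```python
# Minimal sketch (illustrative hyperparameters, not the paper's exact setup):
# vanilla gradient descent on a depth-3 factorization W = W3 @ W2 @ W1 fitted to
# a noisy rank-2 ground truth, tracking an entropy-based effective rank.
import numpy as np

rng = np.random.default_rng(0)

def effective_rank(M, eps=1e-12):
    """Entropy-based effective rank: exp of the entropy of the normalized singular values."""
    s = np.linalg.svd(M, compute_uv=False)
    p = s / (s.sum() + eps)
    return float(np.exp(-np.sum(p * np.log(p + eps))))

# Ground truth: symmetric rank-2 matrix with well-separated eigenvalues (3 and 1)
# plus small additive noise, so that early stopping acts as a denoiser.
d = 20
Q, _ = np.linalg.qr(rng.standard_normal((d, 2)))
Y = Q @ np.diag([3.0, 1.0]) @ Q.T + 0.01 * rng.standard_normal((d, d))

# Depth-3 factorization, each factor initialized as a small multiple of the identity.
depth, init_scale, lr, steps = 3, 1e-2, 0.05, 3000
Ws = [init_scale * np.eye(d) for _ in range(depth)]

for t in range(steps + 1):
    # End-to-end product W = W_depth @ ... @ W_1 and residual of 0.5 * ||W - Y||_F^2.
    W = Ws[0]
    for Wi in Ws[1:]:
        W = Wi @ W
    R = W - Y

    if t % 300 == 0:
        print(f"step {t:5d}   loss {0.5 * np.sum(R**2):8.4f}   effective rank {effective_rank(W):5.2f}")

    # Vanilla gradient descent on every factor; by the chain rule the gradient with
    # respect to W_i is A_i^T R B_i^T, where A_i is the product of the factors above
    # W_i and B_i the product of the factors below it.
    grads = []
    for i in range(depth):
        B = np.eye(d)
        for Wi in Ws[:i]:
            B = Wi @ B
        A = np.eye(d)
        for Wi in Ws[i + 1:]:
            A = Wi @ A
        grads.append(A.T @ R @ B.T)
    Ws = [Wi - lr * g for Wi, g in zip(Ws, grads)]
```

Under these assumptions one would expect the printed effective rank to drop from its degenerate value at the identity-scaled initialization and then increase incrementally towards 2, with the direction of the larger eigenvalue recovered first, in line with the behaviour described above.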

Related research

05/31/2019  Implicit Regularization in Deep Matrix Factorization
Efforts to understand the generalization mystery in deep learning have l...

11/17/2021  How and When Random Feedback Works: A Case Study of Low-Rank Matrix Factorization
The success of gradient descent in ML and especially for learning neural...

01/27/2023  Understanding Incremental Learning of Gradient Descent: A Fine-grained Analysis of Matrix Sensing
It is believed that Gradient Descent (GD) induces an implicit bias towar...

06/13/2017  Provable Alternating Gradient Descent for Non-negative Matrix Factorization with Strong Correlations
Non-negative matrix factorization is a basic tool for decomposing data i...

02/13/2020  Fast Convergence for Langevin Diffusion with Matrix Manifold Structure
In this paper, we study the problem of sampling from distributions of th...

11/17/2019  Deep Matrix Factorization with Spectral Geometric Regularization
Deep Matrix Factorization (DMF) is an emerging approach to the problem o...

09/21/2022  A Validation Approach to Over-parameterized Matrix and Image Recovery
In this paper, we study the problem of recovering a low-rank matrix from...
