Implicit Regularization in Matrix Factorization

05/25/2017
by Suriya Gunasekar et al.

We study implicit regularization when optimizing an underdetermined quadratic objective over a matrix X with gradient descent on a factorization of X. We conjecture and provide empirical and theoretical evidence that with small enough step sizes and initialization close enough to the origin, gradient descent on a full dimensional factorization converges to the minimum nuclear norm solution.
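The conjectured behavior is easy to probe numerically. The sketch below (illustrative only, not the paper's code; the rank, dimensions, observation pattern, and step size are all assumptions) runs gradient descent on a full-dimensional factorization X = UUᵀ of an underdetermined entry-fitting objective, starting near the origin with a small step size, and then inspects the spectrum of the result:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 10, 2

# Hypothetical setup for illustration: a rank-2 PSD ground truth X* = W W^T.
W = rng.standard_normal((n, r))
X_star = W @ W.T

# Underdetermined quadratic objective: fit only a random symmetric
# subset of entries, so many global minimizers exist.
mask = rng.random((n, n)) < 0.4
mask = mask | mask.T

# Gradient descent on a full-dimensional factorization X = U U^T,
# with a small step size and initialization close to the origin.
U = 1e-3 * rng.standard_normal((n, n))
eta = 0.002
for _ in range(50_000):
    R = (U @ U.T - X_star) * mask  # residual on observed entries
    U -= eta * 4 * R @ U           # gradient of ||R||_F^2 w.r.t. U (R is symmetric)

X_hat = U @ U.T

# Among the many matrices fitting the observations, the iterate lands on an
# approximately low-rank one, consistent with a minimum-nuclear-norm bias.
s = np.linalg.svd(X_hat, compute_uv=False)
print(f"observed fit error: {np.linalg.norm((X_hat - X_star) * mask):.2e}")
print(f"sigma_3 / sigma_1:  {s[2] / s[0]:.2e}")
```

With a larger initialization or step size the same iteration still fits the observations but loses the low-rank structure, which is why both hypotheses appear in the conjecture.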

Related research:

- Towards Resolving the Implicit Bias of Gradient Descent for Matrix Factorization: Greedy Low-Rank Learning (12/17/2020)
- Deep Matrix Factorization with Spectral Geometric Regularization (11/17/2019)
- Implicit Regularization in Deep Tensor Factorization (05/04/2021)
- Algorithmic Regularization in Over-parameterized Matrix Sensing and Neural Networks with Quadratic Activations (12/26/2017)
- Noisy Gradient Descent Converges to Flat Minima for Nonconvex Matrix Factorization (02/24/2021)
- Linear Convergence and Implicit Regularization of Generalized Mirror Descent with Time-Dependent Mirrors (09/18/2020)
- On the Implicit Bias of Adam (08/31/2023)
