The Case for Full-Matrix Adaptive Regularization

06/08/2018
by Naman Agarwal et al.

Adaptive regularization methods come in diagonal and full-matrix variants. However, only the former have enjoyed widespread adoption in training large-scale deep models. This is due to the computational overhead of manipulating a full matrix in high dimension. In this paper, we show how to make full-matrix adaptive regularization practical and useful. We present GGT, a truly scalable full-matrix adaptive optimizer. At the heart of our algorithm is an efficient method for computing the inverse square root of a low-rank matrix. We show that GGT converges to first-order local minima, providing the first rigorous theoretical analysis of adaptive regularization in non-convex optimization. In preliminary experiments, GGT trains faster across a variety of synthetic tasks and standard deep learning benchmarks.
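To make the abstract's central claim concrete, below is a minimal sketch (not the authors' released code) of the low-rank inverse-square-root trick it alludes to: preconditioning a gradient g by (G Gᵀ + εI)^(-1/2), where G stacks a window of r recent gradients as columns with r much smaller than the dimension d. The names G, g, eps, and the window size are illustrative assumptions; the point is that all heavy computation happens in the small r×r space, never on a d×d matrix.

```python
import numpy as np

def preconditioned_step(G: np.ndarray, g: np.ndarray, eps: float = 1e-4) -> np.ndarray:
    """Return (G G^T + eps*I)^(-1/2) @ g without forming the d x d matrix.

    Work in the r x r space: eigendecompose G^T G = V diag(s) V^T, so the
    nonzero eigenpairs of G G^T are (s_i, u_i) with u_i = G v_i / sqrt(s_i).
    On the orthogonal complement of span(G), the preconditioner acts as
    eps^(-1/2) times the identity.
    """
    d, r = G.shape
    s, V = np.linalg.eigh(G.T @ G)                    # r x r problem, cheap since r << d
    s = np.clip(s, 0.0, None)                         # guard against tiny negative eigenvalues
    nonzero = s > 1e-12
    U = (G @ V[:, nonzero]) / np.sqrt(s[nonzero])     # d x r' matrix with orthonormal columns
    coeff = U.T @ g                                   # component of g inside span(G)
    inside = U @ ((s[nonzero] + eps) ** -0.5 * coeff) # shrink along observed gradient directions
    outside = (g - U @ coeff) / np.sqrt(eps)          # complement scaled by eps^(-1/2)
    return inside + outside

# Toy usage: a window of r = 5 past gradients in d = 1000 dimensions.
rng = np.random.default_rng(0)
G = rng.standard_normal((1000, 5))
g = rng.standard_normal(1000)
step = preconditioned_step(G, g)
```

The cost per step is O(dr² + r³) rather than the O(d³) of a naive full-matrix update, which is what makes the full-matrix preconditioner practical at deep-learning scale.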

