Regularized Step Directions in Conjugate Gradient Minimization for Machine Learning

10/12/2021
by Cassidy K. Buhler, et al.

Conjugate gradient minimization methods (CGM) and their accelerated variants are widely used in machine learning applications. We focus on the use of cubic regularization to improve the CGM search direction independently of the steplength (learning rate) computation. Using Shanno's reformulation of CGM as a memoryless BFGS method, we derive new formulas for the regularized step direction, which can be evaluated without additional computational effort. The new step directions are shown to improve iteration counts and runtimes and to reduce the need to restart the CGM iteration.
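To make the memoryless-BFGS view concrete, below is a minimal Python sketch of the unregularized direction the paper starts from: with H the BFGS update of the identity built from only the most recent step s_k = x_{k+1} - x_k and gradient change y_k = g_{k+1} - g_k, the search direction is d = -H g. The toy quadratic objective, function names, and restart test are illustrative assumptions; the paper's cubic-regularization corrections to this direction are derived in the full text and are not reproduced here.

```python
# Sketch of Shanno's memoryless-BFGS formulation of nonlinear CG.
# Illustrative only: the objective and restart tolerance are assumptions,
# and the paper's regularized step directions are not shown.
import numpy as np

def memoryless_bfgs_direction(g, s, y):
    """Return d = -H g, where H is the BFGS update of the identity
    using the latest step s and gradient difference y."""
    ys = y @ s
    # Restart on curvature failure: fall back to steepest descent.
    if ys <= 1e-12 * np.linalg.norm(y) * np.linalg.norm(s):
        return -g
    rho = 1.0 / ys
    sg, yg = s @ g, y @ g
    # Expansion of -H g with H = (I - rho*s*y^T)(I - rho*y*s^T) + rho*s*s^T
    return (-g
            + rho * (yg * s + sg * y)
            - rho * sg * (1.0 + rho * (y @ y)) * s)

# Toy usage on a convex quadratic f(x) = 0.5 x^T A x - b^T x.
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -2.0])
grad = lambda x: A @ x - b

x = np.zeros(2)
g = grad(x)
d = -g  # first iteration: steepest descent
for k in range(50):
    alpha = -(g @ d) / (d @ (A @ d))  # exact line search for a quadratic
    x_new = x + alpha * d
    g_new = grad(x_new)
    if np.linalg.norm(g_new) < 1e-10:
        x, g = x_new, g_new
        break
    d = memoryless_bfgs_direction(g_new, x_new - x, g_new - g)
    x, g = x_new, g_new

print(k, x, np.linalg.norm(grad(x)))
```

With exact line searches, this memoryless BFGS iteration coincides with classical conjugate gradients (Shanno's observation), which is what lets the paper modify the direction via regularization without changing the steplength computation.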


