
Fast Gradient Methods with Alignment for Symmetric Linear Systems without Using Cauchy Step

09/03/2019
by Qinmeng Zou, et al.

The performance of gradient methods has been considerably improved by the introduction of delayed parameters. After two and a half decades, the exploitation of second-order information has recently given rise to Cauchy-based methods with alignment, which asymptotically confine the search to subspaces of smaller and smaller dimension. These are generally regarded as the state of the art among gradient methods. This paper reveals the spectral properties of the minimal gradient and asymptotically optimal steps, and then proposes three fast methods with alignment that do not use the Cauchy step. Convergence results are provided, and numerical experiments show that the new methods are competitive and more stable alternatives to the classical Cauchy-based methods. In particular, alignment gradient methods have advantages over Krylov subspace methods in some situations, which makes them attractive in practice.
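To make the terminology concrete, here is a minimal NumPy sketch of the classical step rules the abstract refers to: the Cauchy (steepest descent) step, the minimal gradient step, and a "delayed" step (the Barzilai-Borwein BB1 rule, which for a linear system amounts to reusing the Cauchy step of the previous iterate). The function and variable names are our own; this illustrates only these textbook iterations, not the alignment methods proposed in the paper.

```python
import numpy as np

def gradient_method(A, b, x0, step="mg", tol=1e-8, max_iter=5000):
    """Gradient iteration x_{k+1} = x_k - alpha_k * g_k for an SPD system
    A x = b, where g_k = A x_k - b is the gradient of 0.5 x^T A x - b^T x.
    Sketch of classical step rules only, not the paper's alignment methods."""
    x = x0.copy()
    g = A @ x - b
    alpha_cauchy_prev = None
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        Ag = A @ g
        alpha_cauchy = (g @ g) / (g @ Ag)
        if step == "sd":
            # Cauchy step: exact minimizer of f along the direction -g
            alpha = alpha_cauchy
        elif step == "mg":
            # minimal gradient step: minimizes ||g_{k+1}|| over alpha
            alpha = (g @ Ag) / (Ag @ Ag)
        else:
            # delayed (BB1) step: Cauchy step of the previous iterate;
            # fall back to the Cauchy step on the first iteration
            alpha = alpha_cauchy if alpha_cauchy_prev is None else alpha_cauchy_prev
        alpha_cauchy_prev = alpha_cauchy
        x = x - alpha * g
        g = A @ x - b
    return x, k

# usage on a random SPD test matrix with condition number 100
rng = np.random.default_rng(0)
n = 100
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = Q @ np.diag(np.linspace(1.0, 100.0, n)) @ Q.T
b = rng.standard_normal(n)
for rule in ("sd", "mg", "bb"):
    x, iters = gradient_method(A, b, np.zeros(n), step=rule)
    print(rule, iters, np.linalg.norm(A @ x - b))
```

On such a test problem the delayed (BB1) rule typically needs far fewer iterations than the monotone Cauchy and minimal gradient rules, which is the improvement from "delayed parameters" that the abstract mentions.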

Related research
10/26/2020

Convergence Acceleration via Chebyshev Step: Plausible Interpretation of Deep-Unfolded Gradient Descent

Deep unfolding is a promising deep-learning technique, whose network arc...
02/20/2020

Second Order Optimization Made Practical

Optimization in machine learning, both theoretical and applied, is prese...
08/23/2022

A Stochastic Variance Reduced Gradient using Barzilai-Borwein Techniques as Second Order Information

In this paper, we consider improving the stochastic variance reduced gra...
02/05/2021

In-Loop Meta-Learning with Gradient-Alignment Reward

At the heart of the standard deep learning training loop is a greedy gra...
01/29/2023

Discrete gradient structure of a second-order variable-step method for nonlinear integro-differential models

The discrete gradient structure and the positive definiteness of discret...
05/30/2023

Flexible Enlarged Conjugate Gradient Methods

Enlarged Krylov subspace methods and their s-step versions were introduc...
12/03/2015

Kalman-based Stochastic Gradient Method with Stop Condition and Insensitivity to Conditioning

Modern proximal and stochastic gradient descent (SGD) methods are believ...