On Extensions of Limited Memory Steepest Descent Method

12/03/2019
by Qinmeng Zou, et al.

We present extensions of the limited memory steepest descent method based on spectral properties and cyclic iterations. Our aim is to show that sweep and delayed strategies can be combined to improve the performance of gradient methods. Numerical results indicate that the new methods outperform the original version. We conclude with remarks on stability and parallel implementation.
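The full text is not reproduced here, but the family of methods the abstract refers to can be illustrated with a small sketch. The following is a minimal, hypothetical Python example of a gradient method on a convex quadratic that uses a Barzilai-Borwein (spectral) stepsize recomputed only every `m` iterations, i.e. a cyclic reuse of a delayed stepsize; it is an illustration of the general idea, not the paper's actual algorithm.

```python
import numpy as np

def cyclic_bb(A, b, x0, m=2, tol=1e-10, max_iter=500):
    """Gradient descent on f(x) = 0.5 x^T A x - b^T x (A symmetric
    positive definite) with a Barzilai-Borwein stepsize that is
    recomputed only every m iterations and reused in between."""
    x = np.asarray(x0, dtype=float)
    g = A @ x - b                        # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(A, 2)   # safe initial stepsize 1/lambda_max
    x_prev, g_prev = None, None
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if x_prev is not None and k % m == 0:
            s, y = x - x_prev, g - g_prev
            alpha = (s @ s) / (s @ y)    # BB1 spectral stepsize
        x_prev, g_prev = x, g
        x = x - alpha * g
        g = A @ x - b
    return x
```

With `m = 1` this reduces to the classical Barzilai-Borwein method; larger `m` delays the stepsize update, which is the kind of cyclic/delayed behavior the abstract discusses.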


