On Extensions of Limited Memory Steepest Descent Method

by Qinmeng Zou, et al.

We present extensions of the limited memory steepest descent method based on spectral properties and cyclic iterations. Our aim is to show that sweep and delayed strategies can be combined to improve the performance of gradient methods. Numerical results are reported which indicate that the new methods outperform the original version. Remarks on stability and parallel implementation are given at the end.
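As background for the gradient methods discussed above, the sketch below illustrates a basic spectral gradient iteration on a quadratic, using the Barzilai-Borwein (BB1) stepsize; this is the classical building block underlying limited memory steepest descent, not the authors' extended method, and the function name and test problem are illustrative assumptions.

```python
import numpy as np

def bb_gradient_descent(A, b, x0, iters=100, tol=1e-8):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    with the Barzilai-Borwein stepsize alpha_k = (s^T s)/(s^T y),
    where s = x_k - x_{k-1} and y = g_k - g_{k-1}."""
    x = x0.astype(float)
    g = A @ x - b                          # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(A, 2)     # safe initial stepsize 1/||A||_2
    for _ in range(iters):
        if np.linalg.norm(g) < tol:        # converged: gradient is tiny
            break
        x_new = x - alpha * g              # gradient step
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        alpha = (s @ s) / (s @ y)          # BB1: reciprocal of a Rayleigh quotient of A
        x, g = x_new, g_new
    return x

# Example: the minimizer of the quadratic solves A x = b.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = bb_gradient_descent(A, b, np.zeros(2))
```

Limited memory variants generalize this idea by computing several stepsizes at once from Ritz values of `A` estimated from the last few gradients, which is where the sweep and delayed strategies mentioned in the abstract come into play.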




