
Parameter Estimation in the Hermitian and Skew-Hermitian Splitting Method Using Gradient Iterations

by Qinmeng Zou, et al.

This paper presents enhancement strategies for the Hermitian and skew-Hermitian splitting (HSS) method based on gradient iterations. The spectral properties of the coefficient matrix are exploited for parameter estimation, often resulting in better convergence. In particular, steepest descent with early stopping can produce a rough estimate of the optimal parameter, which is preferable to an arbitrary choice, since the latter often causes stability problems or slow convergence. Additionally, lagged gradient methods are considered as inner solvers for the splitting method. Experiments show that they are competitive with the conjugate gradient method at low precision.
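The idea of estimating the splitting parameter from gradient iterations can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the classical result that the optimal HSS parameter for a positive definite Hermitian part H is alpha* = sqrt(lmin * lmax), where lmin and lmax are the extreme eigenvalues of H, and it uses the Rayleigh quotients visited by a few steepest descent steps as rough bounds on those eigenvalues. The function name and stopping rule are illustrative choices.

```python
import numpy as np

def estimate_hss_parameter(H, b, max_iter=20, tol=1e-6):
    """Rough HSS parameter estimate via steepest descent with early stopping.

    Assumes H is the Hermitian positive definite part of the coefficient
    matrix. The Rayleigh quotients encountered along the steepest descent
    path lie in [lmin, lmax], so their extremes give a crude estimate of
    alpha* = sqrt(lmin * lmax). (Sketch under stated assumptions.)
    """
    x = np.zeros_like(b)
    r = b - H @ x
    quotients = []
    for _ in range(max_iter):
        Hr = H @ r
        q = (r.conj() @ Hr).real / (r.conj() @ r).real  # Rayleigh quotient
        quotients.append(q)
        x = x + r / q  # steepest descent step with exact line search
        r = b - H @ x
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break  # early stopping: a rough estimate suffices
    return np.sqrt(min(quotients) * max(quotients))

# Example: for H = diag(1, 2, 5, 10), the estimate falls between
# the extreme eigenvalues 1 and 10 of H.
H = np.diag([1.0, 2.0, 5.0, 10.0])
alpha = estimate_hss_parameter(H, np.ones(4))
```

Because the quotients only bracket the spectrum loosely, a few iterations already yield a usable parameter, which is the point of using early stopping rather than solving the eigenvalue problem accurately.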
