
Parameter Estimation in the Hermitian and Skew-Hermitian Splitting Method Using Gradient Iterations

09/03/2019
by Qinmeng Zou, et al.

This paper presents enhancement strategies for the Hermitian and skew-Hermitian splitting (HSS) method based on gradient iterations. Spectral properties are exploited for parameter estimation, often resulting in faster convergence. In particular, steepest descent with early stopping can generate a rough estimate of the optimal parameter, which is preferable to an arbitrary choice, since the latter often causes stability problems or slow convergence. Additionally, lagged gradient methods are considered as inner solvers for the splitting method. Experiments show that they are competitive with the conjugate gradient method when only low precision is required.
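
To make the setting concrete, the sketch below implements the classical two-step HSS iteration together with a rough parameter estimate in the spirit described above. It assumes the Hermitian part H of A is positive definite and uses the well-known bound-minimizing shift alpha* = sqrt(lmin(H) * lmax(H)) from Bai, Golub, and Ng; the extreme eigenvalues are roughly estimated from the Rayleigh quotients observed during a few early-stopped steepest-descent iterations on a random right-hand side. The function names (hss_solve, estimate_alpha) and the estimation loop are illustrative stand-ins under these assumptions, not the paper's exact scheme.

import numpy as np

def hss_solve(A, b, alpha, tol=1e-8, max_iter=500):
    """Two-step HSS iteration for A x = b with the splitting A = H + S,
    where H = (A + A^H)/2 is Hermitian and S = (A - A^H)/2 is
    skew-Hermitian, and alpha > 0 is the shift parameter."""
    n = A.shape[0]
    H = (A + A.conj().T) / 2
    S = (A - A.conj().T) / 2
    I = np.eye(n)
    x = np.zeros(n, dtype=A.dtype)
    for k in range(max_iter):
        # First half-step:  (alpha I + H) x_half = (alpha I - S) x + b
        x_half = np.linalg.solve(alpha * I + H, (alpha * I - S) @ x + b)
        # Second half-step: (alpha I + S) x_new  = (alpha I - H) x_half + b
        x = np.linalg.solve(alpha * I + S, (alpha * I - H) @ x_half + b)
        if np.linalg.norm(b - A @ x) <= tol * np.linalg.norm(b):
            break
    return x, k + 1

def estimate_alpha(H, num_steps=15, seed=0):
    """Rough estimate of alpha* = sqrt(lmin(H) * lmax(H)) from the Rayleigh
    quotients of the gradients in a few early-stopped steepest-descent
    iterations on H y = c (an illustrative stand-in, not the paper's
    estimation scheme)."""
    rng = np.random.default_rng(seed)
    c = rng.standard_normal(H.shape[0])
    y = np.zeros_like(c)
    lam_lo, lam_hi = np.inf, -np.inf
    for _ in range(num_steps):
        g = H @ y - c                      # gradient of 1/2 y^T H y - c^T y
        q = (g @ (H @ g)) / (g @ g)        # Rayleigh quotient, in [lmin, lmax]
        lam_lo, lam_hi = min(lam_lo, q), max(lam_hi, q)
        y -= (g @ g) / (g @ (H @ g)) * g   # exact line-search step
    return np.sqrt(lam_lo * lam_hi)

if __name__ == "__main__":
    # Toy non-Hermitian test problem: Hermitian part diag(1..10), skew noise.
    n = 200
    rng = np.random.default_rng(1)
    M = rng.standard_normal((n, n))
    A = np.diag(np.linspace(1.0, 10.0, n)) + (M - M.T) / np.sqrt(n)
    b = rng.standard_normal(n)
    H = (A + A.T) / 2                      # positive definite by construction
    alpha = estimate_alpha(H)              # rough estimate of sqrt(1 * 10)
    x, iters = hss_solve(A, b, alpha)
    print(f"alpha = {alpha:.3f}, converged in {iters} HSS iterations")

In this toy setting the estimate lands near sqrt(lmin * lmax) ~ 3.16, and any alpha > 0 keeps the iteration convergent because the Hermitian part is positive definite; a rough estimate of the optimal shift mainly affects the convergence speed, which matches the abstract's claim that a rough estimate already beats an arbitrary choice.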
