Adapted AZNN Methods for Time-Varying and Static Matrix Problems
We present adapted Zhang Neural Networks (AZNN) in which the exponential decay constant η and the length of the start-up phase of basic ZNN are tuned to the problem at hand. Specifically, we study experiments with AZNN for factoring time-varying square matrices into products of time-varying symmetric matrices and for the time-varying matrix square root problem. Departing from the small η values and minimal start-up phases generally used in ZNN, we adapt the basic ZNN method to work with large or even gigantic η settings and with start-up phases of arbitrary length based on Euler's low-accuracy finite difference formula. These adaptations significantly accelerate AZNN's convergence and lower its solution error bounds for our chosen problems, to near machine-constant levels or below. Parameter-varying AZNN also allows us to find full-rank symmetrizers of static matrices reliably, for example for the Kahan and Frank matrices and for matrices with highly ill-conditioned eigenvalues and complicated Jordan structures of dimensions from n = 2 on up. This helps in cases where full-rank static matrix symmetrizers had never been successfully computed before.
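The abstract does not spell out the iteration, but the ZNN design pattern it builds on is standard: define an error function E(t), impose the exponential decay Ė = -ηE, solve the resulting equation for Ẋ, and integrate. The following is a minimal sketch of that pattern for the time-varying matrix square root problem X(t)² = A(t), using a forward-Euler step as a stand-in for the paper's start-up phase; the function name and parameter choices are ours, not the paper's.

```python
import numpy as np

def aznn_sqrt_step(X, A, Adot, eta, tau):
    """One Euler step of a ZNN-style iteration for X(t)^2 = A(t).

    Error function E = X^2 - A; imposing Edot = -eta * E gives the
    Sylvester equation  Xdot X + X Xdot = Adot - eta * (X^2 - A),
    which we solve for Xdot via row-major Kronecker vectorization.
    """
    n = X.shape[0]
    rhs = Adot - eta * (X @ X - A)
    # vec_row(X Y + Y X) = (X (x) I + I (x) X^T) vec_row(Y)
    K = np.kron(X, np.eye(n)) + np.kron(np.eye(n), X.T)
    Xdot = np.linalg.solve(K, rhs.flatten()).reshape(n, n)
    return X + tau * Xdot  # forward-Euler update

# Demo on a static symmetric A (so Adot = 0); eta*tau < 2 keeps
# the Euler recursion for the error stable.
A = np.array([[4.0, 1.0],
              [1.0, 9.0]])
X = 2.5 * np.eye(2)                 # rough initial guess
for _ in range(200):
    X = aznn_sqrt_step(X, A, np.zeros((2, 2)), eta=100.0, tau=0.005)
print("residual:", np.linalg.norm(X @ X - A))
```

With the large η used here the per-step error contraction factor is 1 - ητ = 0.5, which illustrates the abstract's point that aggressive η settings (kept stable by a suitable step size) drive the residual toward machine-precision levels quickly.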