Convergence Rate Improvement of Richardson and Newton-Schulz Iterations

08/26/2020
by Alexander Stotsky, et al.

Fast-converging, accurate, computationally efficient, parallelizable, and robust matrix inversion and parameter estimation algorithms are required in many time-critical and accuracy-critical applications, such as system identification, signal and image processing, network and big data analysis, and machine learning. This paper introduces a new composite power series expansion with optionally chosen rates (which can be calculated simultaneously on parallel units with different computational capacities) for further convergence rate improvement of the high-order Newton-Schulz iteration. The new expansion is integrated into the Richardson iteration and yields a significant convergence rate improvement, which is quantified via explicit transient models for the estimation errors and confirmed by simulations. In addition, a recursive and computationally efficient version of the combination of the Richardson iteration and the Newton-Schulz iteration with composite expansion is developed for simultaneous calculations. Moreover, a unified factorization is developed in the form of a toolkit for power series expansion, which results in a new family of computationally efficient Newton-Schulz algorithms.
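For concreteness, the sketch below illustrates the two classical building blocks named in the abstract: an order-p Newton-Schulz (hyperpower) iteration for approximate matrix inversion, and a Richardson iteration that uses the computed approximate inverse as a preconditioner. This is a minimal sketch of the standard iterations only; the paper's composite power series expansion, its parallel scheduling across units, and the recursive combined algorithm are not reproduced here, and the function names, initialization choice, and test data are illustrative assumptions.

```python
import numpy as np

def newton_schulz(A, order=2, tol=1e-12, max_iter=100):
    """Classical order-p Newton-Schulz (hyperpower) iteration for A^{-1}.

    With residual R_k = I - A X_k, the update
        X_{k+1} = X_k (I + R_k + R_k^2 + ... + R_k^{p-1})
    gives R_{k+1} = R_k^p, i.e. convergence of order p whenever
    ||I - A X_0|| < 1. Illustrative sketch, not the paper's
    composite-expansion variant.
    """
    n = A.shape[0]
    I = np.eye(n)
    # A standard initialization ensuring ||I - A X_0||_2 < 1 for nonsingular A,
    # since ||A||_2^2 <= ||A||_1 ||A||_inf:
    X = A.T / (np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))
    for _ in range(max_iter):
        R = I - A @ X
        if np.linalg.norm(R, "fro") < tol:
            break
        # Evaluate the truncated series I + R + ... + R^{p-1} in Horner form.
        S = I.copy()
        for _ in range(order - 1):
            S = I + R @ S
        X = X @ S
    return X

def preconditioned_richardson(A, b, X, iters=50):
    """Richardson iteration x_{k+1} = x_k + X (b - A x_k), where X is an
    approximate inverse of A (e.g. from newton_schulz); the error contracts
    as e_{k+1} = (I - X A) e_k, so it converges when ||I - X A|| < 1."""
    x = X @ b
    for _ in range(iters):
        x = x + X @ (b - A @ x)
    return x

# Usage on a well-conditioned (diagonally dominant) test matrix.
A = np.diag(np.arange(1.0, 6.0)) + 0.1 * np.ones((5, 5))
X = newton_schulz(A, order=3)
print("inversion residual:", np.linalg.norm(np.eye(5) - A @ X))
x = preconditioned_richardson(A, np.ones(5), X)
print("solve residual:    ", np.linalg.norm(A @ x - np.ones(5)))
```

Raising `order` trades more matrix multiplications per step for a higher convergence order, which is the dimension along which the paper's composite expansion and parallelization operate.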

