A New Cyclic Gradient Method Adapted to Large-Scale Linear Systems

07/02/2019
by   Qinmeng Zou, et al.

This paper proposes a new gradient method for solving large-scale linear systems. Theoretical analysis shows that the new method has the finite termination property in two dimensions and converges R-linearly in any dimension. Experimental results first illustrate the issue of parallel implementation; the solution of a large-scale problem then shows that the new method outperforms the other gradient methods tested and is even competitive with the conjugate gradient method.
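To make the setting concrete, the sketch below shows a generic *cyclic* gradient iteration for a symmetric positive definite system Ax = b: the exact (Cauchy) steepest-descent stepsize is recomputed once per cycle and reused for the remaining steps of the cycle. This is a standard cyclic-gradient template, not necessarily the specific method proposed in the paper; the cycle length `m`, tolerance, and stopping rule are illustrative assumptions.

```python
import numpy as np

def cyclic_gradient(A, b, x0, m=4, tol=1e-10, max_iter=1000):
    """Generic cyclic gradient sketch for an SPD system Ax = b.

    Minimizes f(x) = 0.5 x^T A x - b^T x. The Cauchy stepsize is
    recomputed every m iterations and reused in between; m, tol and
    max_iter are illustrative choices, not taken from the paper.
    """
    x = np.asarray(x0, dtype=float).copy()
    g = A @ x - b                  # gradient of f at x (the residual)
    alpha = 0.0
    for k in range(max_iter):
        if np.linalg.norm(g) <= tol * np.linalg.norm(b):
            break                  # relative residual small enough
        if k % m == 0:             # refresh stepsize at cycle start
            Ag = A @ g
            alpha = (g @ g) / (g @ Ag)  # exact steepest-descent step
        x -= alpha * g
        g = A @ x - b
    return x

# Usage on a small SPD system
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = cyclic_gradient(A, b, np.zeros(2))
```

Reusing one stepsize across a cycle is what makes such methods attractive in parallel settings: the inner products needed for the stepsize (a global reduction) are computed only once per cycle instead of at every iteration.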

