Accelerated Randomized Block-Coordinate Algorithms for Co-coercive Equations and Applications

01/08/2023 ∙ by Quoc Tran-Dinh, et al.

In this paper, we develop an accelerated randomized block-coordinate algorithm to approximate a solution of a co-coercive equation. Such equations play a central role in optimization and related fields, and cover many mathematical models as special cases, including convex optimization, convex-concave minimax, and variational inequality problems. Our algorithm relies on a recent Nesterov-type accelerated interpretation of the Halpern fixed-point iteration from [48]. We establish that the new algorithm achieves an 𝒪(1/k^2) convergence rate on 𝔼[‖Gx^k‖^2] at the last iterate, where G is the underlying co-coercive operator, 𝔼[·] denotes the expectation, and k is the iteration counter. This rate is significantly faster than the 𝒪(1/k) rates of standard forward or gradient-based methods from the literature. We also prove o(1/k^2) rates on both 𝔼[‖Gx^k‖^2] and 𝔼[‖x^{k+1} - x^k‖^2]. Next, we apply our method to derive two accelerated randomized block-coordinate variants of the forward-backward and Douglas-Rachford splitting schemes, respectively, for solving a monotone inclusion involving the sum of two operators. As a byproduct, these variants also converge faster than their non-accelerated counterparts. Finally, we apply our scheme to a finite-sum monotone inclusion with various applications in machine learning and statistical learning, including federated learning. As a result, we obtain a novel federated-learning-type algorithm with fast and provable convergence rates.
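To make the mechanism concrete, the following minimal Python sketch combines a Halpern-anchored forward step with uniform random block sampling for a co-coercive equation Gx = 0. Everything here (the name rbc_halpern, the equal-size block partition, the anchoring weight 1/(k+2), and the step size 1/L) is an illustrative assumption rather than the paper's exact scheme, whose carefully tuned parameters are what yield the accelerated rates stated above.

import numpy as np

def rbc_halpern(G, x0, num_blocks, L, iters=2000, seed=None):
    # Hypothetical sketch: randomized block-coordinate Halpern iteration
    # for Gx = 0, assuming G is 1/L-co-coercive, so that the map
    # x -> x - (1/L) * G(x) is nonexpansive.
    rng = np.random.default_rng(seed)
    anchor = np.asarray(x0, dtype=float).copy()   # Halpern anchor point x^0
    x = anchor.copy()
    blocks = np.array_split(np.arange(x.size), num_blocks)
    eta = 1.0 / L                                 # admissible step in (0, 2/L]
    for k in range(iters):
        beta = 1.0 / (k + 2)                      # anchoring weight, decays to 0
        gx = G(x)      # full evaluation for clarity; a practical block solver
                       # would evaluate G only on the sampled block
        idx = blocks[rng.integers(num_blocks)]    # uniformly sampled block
        # anchored forward step applied only to the sampled block
        x[idx] = beta * anchor[idx] + (1.0 - beta) * (x[idx] - eta * gx[idx])
    return x

# Example: G is the gradient of a smooth convex least-squares objective,
# hence 1/L-co-coercive with L = ||A||_2^2.
A = np.array([[2.0, 0.0, 1.0],
              [0.0, 1.5, 0.5],
              [1.0, 0.5, 3.0]])
b = np.array([1.0, -2.0, 0.5])
G = lambda x: A.T @ (A @ x - b)
x_sol = rbc_halpern(G, np.zeros(3), num_blocks=3,
                    L=np.linalg.norm(A, 2)**2, iters=5000, seed=0)
print(np.linalg.norm(G(x_sol)))                   # residual ||Gx|| should be small

Since the anchoring weight vanishes as k grows, the iteration approaches a plain randomized block forward step while the anchor stabilizes the early iterates; the paper's analysis shows how the precise choice of these weights produces the accelerated 𝒪(1/k^2) rate in expectation.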


