Halpern-Type Accelerated and Splitting Algorithms For Monotone Inclusions

10/15/2021 ∙ by Quoc Tran-Dinh, et al.
In this paper, we develop a new class of accelerated algorithms for solving some classes of maximally monotone equations as well as monotone inclusions. Instead of using Nesterov's acceleration approach, our methods rely on the so-called Halpern-type fixed-point iteration of [32], which has recently been exploited by a number of researchers, including [24, 70]. First, we derive a new variant of the anchored extra-gradient scheme of [70], based on Popov's past extra-gradient method, to solve a maximally monotone equation G(x) = 0. We show that our method achieves the same 𝒪(1/k) convergence rate (up to a constant factor) as the anchored extra-gradient algorithm on the operator norm ‖G(x_k)‖, but requires only one evaluation of G at each iteration, where k is the iteration counter. Next, we develop two splitting algorithms to approximate a zero of the sum of two maximally monotone operators. The first algorithm originates from the anchored extra-gradient method combined with a splitting technique, while the second is its Popov variant, which reduces the per-iteration complexity. Both algorithms appear to be new and can be viewed as accelerated variants of the Douglas-Rachford (DR) splitting method. They both achieve 𝒪(1/k) rates on the norm ‖G_γ(x_k)‖ of the forward-backward residual operator G_γ(·) associated with the problem. We also propose a new accelerated Douglas-Rachford splitting scheme for solving this problem, which achieves an 𝒪(1/k) convergence rate on ‖G_γ(x_k)‖ under only maximally monotone assumptions. Finally, we specialize our first algorithm to solve convex-concave minimax problems and apply our accelerated DR scheme to derive a new variant of the alternating direction method of multipliers (ADMM).
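To make the anchoring idea concrete, below is a minimal Python sketch of a Halpern-anchored past extra-gradient (Popov-type) iteration for G(x) = 0. The function name anchored_popov, the step size eta = 1/(4L), and the stopping test are illustrative assumptions rather than the paper's exact scheme or constants; what the sketch shows is the Halpern anchor weight β_k = 1/(k+2) pulling iterates back toward x_0, and Popov's reuse of the previous operator value so that only one new evaluation of G is needed per iteration.

```python
import numpy as np

def anchored_popov(G, x0, L, max_iter=1000, tol=1e-8):
    """Sketch of a Halpern-anchored past extra-gradient (Popov) method for G(x) = 0.

    Assumes G is monotone and L-Lipschitz. Each loop iteration evaluates G once,
    reusing G(y_{k-1}) from the previous iteration; the step size is illustrative.
    """
    eta = 1.0 / (4.0 * L)          # illustrative step size, not the paper's constant
    x = x0.copy()
    Gy_prev = G(x0)                 # G(y_{-1}); the only extra evaluation, done once
    for k in range(max_iter):
        beta = 1.0 / (k + 2.0)      # Halpern anchor weight, vanishing as k grows
        # Half step: anchor toward x0, move along the *previous* operator value.
        y = beta * x0 + (1.0 - beta) * x - eta * Gy_prev
        Gy = G(y)                   # the single new operator evaluation per iteration
        # Full step: same anchored combination, but with the fresh operator value.
        x = beta * x0 + (1.0 - beta) * x - eta * Gy
        Gy_prev = Gy
        if np.linalg.norm(Gy) <= tol:
            break
    return x

if __name__ == "__main__":
    # Toy test: G(x) = A x with A skew-symmetric is monotone with L = ||A||,
    # a purely rotational case where plain gradient-type steps fail to converge.
    A = np.array([[0.0, 1.0], [-1.0, 0.0]])
    x_star = anchored_popov(lambda x: A @ x, x0=np.array([1.0, 1.0]), L=1.0)
    print(x_star)  # should approach the unique zero x* = 0
```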
