Halpern-Type Accelerated and Splitting Algorithms For Monotone Inclusions

10/15/2021
by Quoc Tran-Dinh, et al.
In this paper, we develop a new type of accelerated algorithm to solve some classes of maximally monotone equations as well as monotone inclusions. Instead of using Nesterov's acceleration approach, our methods rely on the so-called Halpern-type fixed-point iteration introduced in [32] and recently exploited by a number of researchers, including [24, 70]. First, we derive a new variant of the anchored extra-gradient scheme of [70], based on Popov's past extra-gradient method, to solve a maximally monotone equation G(x) = 0. We show that our method achieves the same 𝒪(1/k) convergence rate (up to a constant factor) as the anchored extra-gradient algorithm on the residual norm ‖G(x_k)‖, where k is the iteration counter, but requires only one evaluation of G per iteration. Next, we develop two splitting algorithms to approximate a zero of the sum of two maximally monotone operators. The first algorithm originates from the anchored extra-gradient method combined with a splitting technique, while the second is its Popov-type variant, which reduces the per-iteration complexity. Both algorithms appear to be new and can be viewed as accelerated variants of the Douglas-Rachford (DR) splitting method. They both achieve 𝒪(1/k) rates on the norm ‖G_γ(x_k)‖ of the forward-backward residual operator G_γ(·) associated with the problem. We also propose a new accelerated Douglas-Rachford splitting scheme for this problem, which achieves an 𝒪(1/k) convergence rate on ‖G_γ(x_k)‖ under only maximal monotonicity assumptions. Finally, we specify our first algorithm to solve convex-concave minimax problems and apply our accelerated DR scheme to derive a new variant of the alternating direction method of multipliers (ADMM).
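
To make the anchoring idea concrete, below is a minimal NumPy sketch of a Halpern-anchored past-extragradient (Popov-type) step for G(x) = 0, which reuses the previous half-point evaluation so that only one evaluation of G is needed per iteration. The anchor schedule β_k = 1/(k+2) and the step size η are illustrative assumptions, not the exact constants of the paper's scheme.

```python
# Sketch of a Halpern-anchored past-extragradient (Popov-type) iteration
# for G(x) = 0 with G monotone and Lipschitz. The anchor weight
# beta_k = 1/(k+2) and step size eta are assumed, illustrative choices.
import numpy as np

def anchored_popov(G, x0, eta, num_iters):
    """One G-evaluation per iteration: reuse G at the previous half-point."""
    x = x0.copy()
    g_half = G(x0)                      # bootstrap the "past" evaluation with G(x0)
    for k in range(num_iters):
        beta = 1.0 / (k + 2)            # Halpern anchor weight (assumed schedule)
        anchor = x + beta * (x0 - x)    # pull the iterate back toward the anchor x0
        x_half = anchor - eta * g_half  # extrapolate using the past evaluation
        g_half = G(x_half)              # the single operator evaluation this iteration
        x = anchor - eta * g_half       # main update from the same anchored point
    return x

# Toy monotone operator: G(x) = A x with A skew-symmetric, so <Ax, x> = 0.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 20))
A = M - M.T                             # skew-symmetric => G is monotone
L = np.linalg.norm(A, 2)                # Lipschitz constant of G
G = lambda x: A @ x

x = anchored_popov(G, x0=rng.standard_normal(20), eta=0.25 / L, num_iters=2000)
print("||G(x_k)|| =", np.linalg.norm(G(x)))  # expected to decay roughly like O(1/k)
```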

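For the inclusion 0 ∈ A(x) + B(x), the same Halpern anchoring can be applied directly to the Douglas-Rachford fixed-point operator T = I + J_{γA}(2J_{γB} − I) − J_{γB}, whose fixed points z* give solutions x* = J_{γB}(z*). The sketch below is a generic Halpern iteration on T with the assumed schedule β_k = 1/(k+2); it illustrates the accelerated-DR idea in spirit rather than reproducing the paper's exact scheme.

```python
# Sketch of a Halpern-anchored Douglas-Rachford iteration for 0 in A(x) + B(x).
# The resolvents J_A, J_B and the schedule beta_k = 1/(k+2) are illustrative.
import numpy as np

def halpern_dr(JA, JB, z0, num_iters):
    """Halpern iteration z_{k+1} = beta_k z_0 + (1 - beta_k) T(z_k),
    where T = I + J_A(2 J_B - I) - J_B is the (nonexpansive) DR operator."""
    z = z0.copy()
    for k in range(num_iters):
        x = JB(z)
        y = JA(2 * x - z)
        Tz = z + y - x                  # classical Douglas-Rachford update
        beta = 1.0 / (k + 2)            # Halpern anchor weight (assumed schedule)
        z = beta * z0 + (1 - beta) * Tz
    return JB(z)                        # approximates a zero of A + B

# Toy problem: minimize |x|_1 + 0.5 * ||x - b||^2, split as A = subdiff of |.|_1
# and B = gradient of the quadratic part; both resolvents are closed-form.
gamma, b = 1.0, np.array([3.0, -0.2, 1.5])
JA = lambda v: np.sign(v) * np.maximum(np.abs(v) - gamma, 0.0)  # prox of gamma*|.|_1
JB = lambda v: (v + gamma * b) / (1 + gamma)   # prox of gamma*0.5*||. - b||^2

x = halpern_dr(JA, JB, z0=np.zeros(3), num_iters=500)
print(x)  # soft-thresholding of b at level 1: approximately [2.0, 0.0, 0.5]
```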