Accelerated Additive Schwarz Methods for Convex Optimization with Adaptive Restart

11/05/2020
by Jongho Park, et al.

Based on the observation that additive Schwarz methods for general convex optimization can be interpreted as gradient methods, we propose an acceleration scheme for additive Schwarz methods. By adopting acceleration techniques developed for gradient methods, such as momentum and adaptive restarting, we greatly improve the convergence rate of additive Schwarz methods. The proposed scheme requires no a priori information on the smoothness or sharpness of the target energy functional, so it can be applied to a wide range of convex optimization problems. Numerical results for linear elliptic, nonlinear elliptic, nonsmooth, and nonsharp problems highlight both the superior performance and the broad applicability of the proposed scheme.
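To make the gradient-method interpretation concrete, the sketch below applies Nesterov-style momentum with a gradient-based adaptive restart (in the spirit of O'Donoghue and Candès) on top of a damped two-subdomain additive Schwarz iteration for a 1D Poisson-type quadratic energy. This is a minimal illustration under simple assumptions, not the paper's algorithm: the model problem, the decomposition, the damping factor 0.5, and the restart test are all illustrative choices.

```python
# Minimal sketch (not the authors' code): accelerated additive Schwarz
# with adaptive restart on F(u) = 0.5 u^T A u - b^T u, where A is the
# 1D finite-difference Laplacian. All parameters are illustrative.
import numpy as np

n = 63
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)  # 1D Laplacian
b = np.ones(n) / (n + 1) ** 2

# Two overlapping index sets (subdomains).
half, overlap = n // 2, 4
subdomains = [np.arange(0, half + overlap), np.arange(half - overlap, n)]

def schwarz_step(u):
    """One additive Schwarz correction: solve the local problems
    independently and sum the damped corrections. This acts like a
    preconditioned gradient step for F."""
    r = b - A @ u                      # global residual (negative gradient)
    d = np.zeros_like(u)
    for idx in subdomains:
        d[idx] += np.linalg.solve(A[np.ix_(idx, idx)], r[idx])
    return 0.5 * d                     # damping to handle the overlap

u = np.zeros(n)
v = u.copy()                           # extrapolated (momentum) point
t = 1.0                                # Nesterov momentum parameter
for k in range(200):
    d = schwarz_step(v)
    u_new = v + d
    t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
    v_new = u_new + ((t - 1.0) / t_new) * (u_new - u)
    # Adaptive restart: if the last update opposes the current
    # correction direction, momentum has overshot, so reset it.
    if np.dot(d, u_new - u) < 0:
        t_new, v_new = 1.0, u_new
    u, v, t = u_new, v_new, t_new

print("residual norm:", np.linalg.norm(b - A @ u))
```

The restart test uses only quantities the iteration already computes, which mirrors the abstract's point that no smoothness or sharpness constants need to be known in advance.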


