Additive Schwarz Methods for Convex Optimization as Gradient Methods

12/08/2019
by Jongho Park, et al.

This paper gives a unified convergence analysis of additive Schwarz methods for general convex optimization problems. In analogy with the fact that additive Schwarz methods for linear problems are preconditioned Richardson methods, we prove that additive Schwarz methods for general convex optimization are in fact gradient methods. An abstract framework for the convergence analysis of additive Schwarz methods is then proposed. When applied to linear elliptic problems, the proposed framework agrees with the classical theory. We present applications of the framework to various convex optimization problems, including nonlinear elliptic problems, nonsmooth problems, and nonsharp problems.
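As a concrete illustration of the claim above, here is a minimal NumPy sketch of the linear case, where the equivalence is classical: one additive Schwarz step for the quadratic energy F(u) = (1/2) u^T A u - f^T u coincides with a gradient step preconditioned by M^{-1} = sum_k R_k^T A_k^{-1} R_k. This is not the paper's code; the 1D Laplacian, the two-subdomain split, and the damping parameter tau are illustrative assumptions.

import numpy as np

# Model problem: minimize F(u) = 0.5 * u^T A u - f^T u, i.e., solve A u = f,
# with A the 1D Laplacian stiffness matrix (an illustrative choice).
n = 20
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
f = np.ones(n)

def grad(u):
    # Gradient of the quadratic energy F.
    return A @ u - f

# Two overlapping index sets ("subdomains") covering {0, ..., n-1},
# each encoded by a Boolean restriction matrix R_k.
restrictions = []
for idx in (np.arange(0, 12), np.arange(8, 20)):
    R = np.zeros((len(idx), n))
    R[np.arange(len(idx)), idx] = 1.0
    restrictions.append(R)

tau = 0.5        # damping; each point lies in at most 2 subdomains, so 1/2 is safe
u = np.zeros(n)  # current iterate

# Additive Schwarz step: solve every local problem at the current iterate
# and add the damped local corrections.
corr = sum(R.T @ np.linalg.solve(R @ A @ R.T, R @ (f - A @ u))
           for R in restrictions)
u_as = u + tau * corr

# The same step written as a preconditioned gradient method with
# M^{-1} = sum_k R_k^T A_k^{-1} R_k, where A_k = R_k A R_k^T.
M_inv = sum(R.T @ np.linalg.solve(R @ A @ R.T, R) for R in restrictions)
u_grad = u - tau * M_inv @ grad(u)

print(np.allclose(u_as, u_grad))  # True: the two updates are identical

The final check prints True: the additive Schwarz update and the preconditioned gradient update produce the same vector. The paper's contribution is to extend this identification from the linear case to general convex, possibly nonsmooth, energies.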



Related research

Accelerated Additive Schwarz Methods for Convex Optimization with Adaptive Restart (11/05/2020)
Based on an observation that additive Schwarz methods for general convex...

Additive Schwarz Methods for Convex Optimization with Backtracking (10/14/2021)
This paper presents a novel backtracking strategy for additive Schwarz m...

No-Regret Dynamics in the Fenchel Game: A Unified Framework for Algorithmic Convex Optimization (11/22/2021)
We develop an algorithmic framework for solving convex optimization prob...

A Unified Approach to Error Bounds for Structured Convex Optimization Problems (12/11/2015)
Error bounds, which refer to inequalities that bound the distance of vec...

Convex Optimization Over Risk-Neutral Probabilities (03/05/2020)
We consider a collection of derivatives that depend on the price of an u...

SnapVX: A Network-Based Convex Optimization Solver (09/21/2015)
SnapVX is a high-performance Python solver for convex optimization probl...
