Additive Schwarz Methods for Convex Optimization with Backtracking

by   Jongho Park, et al.
KAIST Department of Mathematical Sciences

This paper presents a novel backtracking strategy for additive Schwarz methods for general convex optimization problems, used as an acceleration scheme. The proposed backtracking strategy is independent of the local solvers, so it can be applied to any algorithm that fits the abstract framework of additive Schwarz methods. By allowing the step size to adaptively increase and decrease along the iterations, the strategy greatly improves the convergence rate of an algorithm, and this improved rate is proven rigorously. In addition, by combining the proposed backtracking strategy with a momentum acceleration technique, we propose a further accelerated additive Schwarz method. Numerical results for various convex optimization problems that support our theory are presented.
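The core idea of the backtracking strategy, adaptively growing and shrinking a step size until a sufficient-decrease condition holds, can be illustrated in the simpler setting of plain gradient descent. The sketch below is a hedged illustration only: the function names, the growth/shrink constants, and the Armijo-type decrease test are illustrative choices, not the paper's actual algorithm or constants, and the paper's method operates on additive Schwarz subproblem solutions rather than a raw gradient step.

```python
import numpy as np

def adaptive_backtracking_gd(f, grad, x0, tau0=1.0, shrink=0.5, grow=2.0,
                             max_iter=200, tol=1e-8):
    """Gradient descent with an adaptive backtracking step size.

    At each iteration the step size tau is tentatively increased (by `grow`),
    then decreased (by `shrink`) until an Armijo-type sufficient-decrease
    condition holds. This mimics the adaptive increase/decrease of the step
    size described in the abstract; all names and constants are illustrative.
    """
    x, tau = np.asarray(x0, dtype=float), tau0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        tau *= grow  # tentatively increase the step size
        # backtrack until sufficient decrease holds
        while f(x - tau * g) > f(x) - 0.5 * tau * np.dot(g, g):
            tau *= shrink
        x = x - tau * g
    return x

# usage: minimize the simple quadratic f(x) = 0.5 ||x||^2
f = lambda x: 0.5 * np.dot(x, x)
grad = lambda x: x
x_star = adaptive_backtracking_gd(f, grad, x0=[3.0, -4.0])
```

Because the step size is re-grown each iteration, a temporarily conservative step does not slow down all subsequent iterations, which is the behavior the abstract credits for the improved convergence rate.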


Accelerated Additive Schwarz Methods for Convex Optimization with Adaptive Restart

Based on an observation that additive Schwarz methods for general convex...

Additive Schwarz Methods for Convex Optimization as Gradient Methods

This paper gives a unified convergence analysis of additive Schwarz meth...

Smooth Structured Prediction Using Quantum and Classical Gibbs Samplers

We introduce a quantum algorithm for solving structured-prediction probl...

A Momentum Accelerated Adaptive Cubic Regularization Method for Nonconvex Optimization

The cubic regularization method (CR) and its adaptive version (ARC) are ...

Reweighted Interacting Langevin Diffusions: an Accelerated Sampling Method for Optimization

We propose a new technique to accelerate sampling methods for solving d...

Analysis and Implementation of an Asynchronous Optimization Algorithm for the Parameter Server

This paper presents an asynchronous incremental aggregated gradient algo...