
Additive Schwarz Methods for Convex Optimization with Backtracking

10/14/2021
by Jongho Park et al.
KAIST, Department of Mathematical Sciences

This paper presents a novel backtracking strategy for additive Schwarz methods for general convex optimization problems, used as an acceleration scheme. The proposed backtracking strategy is independent of local solvers, so it can be applied to any algorithm that fits the abstract framework of additive Schwarz methods. By allowing the step size to both increase and decrease adaptively along the iterations, the strategy greatly improves the convergence rate of an algorithm, and this improved rate is proven rigorously. In addition, by combining the proposed backtracking strategy with a momentum acceleration technique, we propose a further accelerated additive Schwarz method. Numerical results for various convex optimization problems that support our theory are presented.
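The abstract describes the backtracking strategy only at a high level. As a rough illustration, the Python sketch below shows how a solver-independent backtracking loop of this flavor might look: the function name, the Armijo-type sufficient-decrease test, the increase/decrease factors, and the cap on the step size are all assumptions made for illustration, not details taken from the paper.

```python
import numpy as np

def additive_schwarz_backtracking(F, grad_F, local_solvers, u0,
                                  tau0=1.0, inc=2.0, dec=0.5,
                                  max_iter=200, tol=1e-8):
    """Hypothetical sketch: abstract additive Schwarz iteration with a
    bidirectional backtracking search on the step size tau.

    F            : objective, maps ndarray -> float
    grad_F       : gradient of the smooth part, ndarray -> ndarray
    local_solvers: callables, each returning a correction supported
                   on its own subspace (the local solves)
    """
    u = u0.copy()
    tau = tau0
    for _ in range(max_iter):
        # Local problems are independent, so in practice they are
        # solved in parallel; their corrections are simply summed.
        d = sum(solver(u) for solver in local_solvers)
        if np.linalg.norm(d) < tol:
            break
        # "Adaptive increasing": optimistically enlarge the step first
        # (the cap at tau0 is an assumption, not the paper's rule).
        tau = min(tau * inc, tau0)
        # "Adaptive decreasing": backtrack until a sufficient-decrease
        # condition holds; a plain Armijo test stands in here for the
        # paper's actual criterion.
        slope = grad_F(u) @ d
        while F(u + tau * d) > F(u) + 0.5 * tau * slope and tau > 1e-12:
            tau *= dec
        u = u + tau * d
    return u
```

Because the loop touches the local solvers only through the corrections they return, swapping in a different local solver leaves the backtracking logic untouched, which is the sense in which such a strategy is independent of local solvers.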

11/05/2020

Accelerated Additive Schwarz Methods for Convex Optimization with Adaptive Restart

Based on an observation that additive Schwarz methods for general convex...
12/08/2019

Additive Schwarz Methods for Convex Optimization as Gradient Methods

This paper gives a unified convergence analysis of additive Schwarz meth...
09/11/2018

Smooth Structured Prediction Using Quantum and Classical Gibbs Samplers

We introduce a quantum algorithm for solving structured-prediction probl...
10/12/2022

A Momentum Accelerated Adaptive Cubic Regularization Method for Nonconvex Optimization

The cubic regularization method (CR) and its adaptive version (ARC) are ...
01/30/2023

Reweighted Interacting Langevin Diffusions: An Accelerated Sampling Method for Optimization

We proposed a new technique to accelerate sampling methods for solving d...
10/18/2016

Analysis and Implementation of an Asynchronous Optimization Algorithm for the Parameter Server

This paper presents an asynchronous incremental aggregated gradient algo...