Distributed Proximal Splitting Algorithms with Rates and Acceleration

10/02/2020
by Laurent Condat, et al.

We analyze several generic proximal splitting algorithms well suited for large-scale convex nonsmooth optimization. We derive sublinear and linear convergence results with new rates on the function value suboptimality or distance to the solution, as well as new accelerated versions, using varying stepsizes. In addition, we propose distributed variants of these algorithms, which can be accelerated as well. While most existing results are ergodic, our nonergodic results significantly broaden our understanding of primal-dual optimization algorithms.
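To illustrate the family of methods the abstract refers to, here is a minimal sketch of one classical proximal splitting scheme, forward-backward splitting (proximal gradient), together with a Nesterov-style accelerated variant, applied to the lasso problem. This is an illustrative example of the general technique, not the specific algorithms or stepsize rules analyzed in the paper; the function names and the lasso instance are assumptions for the sketch.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, n_iters=500):
    """Forward-backward splitting for the lasso:
    minimize 0.5 * ||A x - b||^2 + lam * ||x||_1.
    Uses the constant stepsize 1/L, where L is the Lipschitz
    constant of the gradient of the smooth term.
    """
    L = np.linalg.norm(A, 2) ** 2      # spectral norm squared = Lipschitz constant
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)                    # forward (gradient) step
        x = soft_threshold(x - grad / L, lam / L)   # backward (proximal) step
    return x

def accelerated_proximal_gradient(A, b, lam, n_iters=500):
    """FISTA-style accelerated variant of the scheme above,
    using Nesterov momentum on an extrapolated point y."""
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(n_iters):
        x_new = soft_threshold(y - A.T @ (A @ y - b) / L, lam / L)
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)  # momentum extrapolation
        x, t = x_new, t_new
    return x
```

The accelerated variant improves the sublinear rate on the function value from O(1/k) to O(1/k^2) for this class of problems; the paper's contribution concerns analogous rates and accelerations for more general primal-dual and distributed splitting schemes.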


Related research

09/05/2022 · DISA: A Dual Inexact Splitting Algorithm for Distributed Convex Composite Optimization
This paper proposes a novel dual inexact splitting algorithm (DISA) for ...

08/02/2019 · Gradient Flows and Accelerated Proximal Splitting Methods
Proximal based methods are well-suited to nonsmooth optimization problem...

05/17/2020 · From Proximal Point Method to Nesterov's Acceleration
The proximal point method (PPM) is a fundamental method in optimization ...

12/06/2022 · BALPA: A Balanced Primal-Dual Algorithm for Nonsmooth Optimization with Application to Distributed Optimization
In this paper, we propose a novel primal-dual proximal splitting algorit...

06/17/2023 · Distributed Accelerated Projection-Based Consensus Decomposition
With the development of machine learning and Big Data, the concepts of l...

10/23/2019 · Accelerated Primal-Dual Algorithms for Distributed Smooth Convex Optimization over Networks
This paper proposes a novel family of primal-dual-based distributed algo...

09/24/2021 · Accelerated nonlinear primal-dual hybrid gradient algorithms with applications to machine learning
The primal-dual hybrid gradient (PDHG) algorithm is a first-order method...
