Distributed Stochastic Consensus Optimization with Momentum for Nonconvex Nonsmooth Problems

11/10/2020
by   Zhiguo Wang, et al.

While many distributed optimization algorithms have been proposed for solving smooth or convex problems over networks, few can handle problems that are both non-convex and non-smooth. Based on a proximal primal-dual approach, this paper presents a new (stochastic) distributed algorithm with Nesterov momentum for accelerated optimization of non-convex, non-smooth problems. Theoretically, we show that the proposed algorithm can reach an ϵ-stationary solution under a constant step size with 𝒪(1/ϵ^2) computation complexity and 𝒪(1/ϵ) communication complexity. Compared with existing gradient-tracking-based methods, the proposed algorithm has the same order of computation complexity but a lower order of communication complexity. To the best of our knowledge, this is the first stochastic algorithm with 𝒪(1/ϵ) communication complexity for non-convex and non-smooth problems. Numerical experiments on a distributed non-convex regression problem and a deep-neural-network-based classification problem illustrate the effectiveness of the proposed algorithms.
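The abstract does not spell out the update rule, so the following is only an illustrative sketch of the three ingredients it names for the non-convex, non-smooth setting: consensus averaging over a network, a Nesterov-style momentum (extrapolation) step, and a proximal step handling the non-smooth term. The toy problem (local least-squares losses plus a shared ℓ1 penalty), the mixing matrix, and all parameter values are assumptions for illustration, not the paper's actual proximal primal-dual method.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (handles the non-smooth term).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def decentralized_prox_momentum(A_list, b_list, W, lam=0.05, step=0.1,
                                beta=0.5, iters=300, seed=0):
    """Illustrative decentralized proximal gradient with a Nesterov-style
    momentum step. Agent i holds f_i(x) = 0.5 * ||A_i x - b_i||^2 and all
    agents share the non-smooth term lam * ||x||_1. W is a doubly
    stochastic mixing matrix. This is a sketch of the named ingredients,
    NOT the paper's primal-dual update."""
    rng = np.random.default_rng(seed)
    n_agents, dim = len(A_list), A_list[0].shape[1]
    x = rng.standard_normal((n_agents, dim))
    x_prev = x.copy()
    for _ in range(iters):
        # Nesterov-style extrapolation at each agent.
        y = x + beta * (x - x_prev)
        # Local gradients evaluated at the extrapolated point.
        grads = np.stack([A.T @ (A @ y[i] - b)
                          for i, (A, b) in enumerate(zip(A_list, b_list))])
        x_prev = x
        # Consensus mixing, gradient step, then prox of the l1 term.
        x = soft_threshold(W @ y - step * grads, step * lam)
    return x
```

Each row of the returned array is one agent's iterate; after mixing, the rows should nearly agree, which is the consensus property the communication steps are paying for.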


