DESTRESS: Computation-Optimal and Communication-Efficient Decentralized Nonconvex Finite-Sum Optimization

10/04/2021
by Boyue Li, et al.

Emerging applications in multi-agent environments, such as the internet of things, networked sensing, autonomous systems, and federated learning, call for decentralized finite-sum optimization algorithms that are resource-efficient in terms of both computation and communication. In this paper, we consider the prototypical setting where agents work collaboratively to minimize the sum of local loss functions by communicating only with their neighbors over a predetermined network topology. We develop a new algorithm, DEcentralized STochastic REcurSive gradient methodS (DESTRESS), for nonconvex finite-sum optimization, which matches the optimal incremental first-order oracle (IFO) complexity of centralized algorithms for finding first-order stationary points while maintaining communication efficiency. Detailed theoretical and numerical comparisons show that DESTRESS improves upon the resource efficiency of prior decentralized algorithms over a wide range of parameter regimes. DESTRESS leverages several key design ideas: stochastic recursive gradient updates with mini-batches for local computation, gradient tracking with extra mixing (i.e., multiple gossip rounds) for per-iteration communication, and careful choices of hyper-parameters together with new analysis frameworks that provably achieve a desirable computation-communication trade-off.
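To make the "gradient tracking with extra mixing" idea concrete, here is a minimal toy sketch (an assumption of this edit, not the authors' implementation): a ring of agents runs gradient tracking, applying several gossip rounds per iteration with a doubly stochastic mixing matrix. The SARAH-style stochastic recursive gradient estimator of DESTRESS is omitted; plain local gradients stand in for it.

```python
import numpy as np

# Toy decentralized problem (assumed for illustration): each of n agents on
# a ring holds f_i(x) = 0.5 * (x - b_i)^2, so the minimizer of the average
# loss (1/n) * sum_i f_i is mean(b).
rng = np.random.default_rng(0)
n = 8                        # number of agents, arranged in a ring
b = rng.normal(size=n)       # per-agent targets; global optimum is b.mean()

# Doubly stochastic mixing matrix for the ring topology: each agent
# averages its value with its two neighbors.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

def mix(v, rounds):
    """Extra mixing: apply `rounds` gossip steps v <- W v."""
    for _ in range(rounds):
        v = W @ v
    return v

def local_grads(x):
    return x - b             # gradient of f_i at agent i's current iterate

eta, K, T = 0.2, 3, 200      # step size, gossip rounds per iteration, iterations
x = np.zeros(n)              # agents' iterates
g_prev = local_grads(x)
g = g_prev.copy()            # gradient trackers, initialized to local gradients

for _ in range(T):
    x = mix(x - eta * g, K)          # local descent step + K gossip rounds
    g_new = local_grads(x)
    g = mix(g + g_new - g_prev, K)   # gradient-tracking update, also mixed
    g_prev = g_new

# All agents reach consensus near the global minimizer b.mean().
```

Because W is doubly stochastic, the tracking update preserves the invariant that the trackers' average equals the average of the local gradients, so at a fixed point every agent's iterate sits at a stationary point of the global average loss; the extra gossip rounds (K > 1) speed up consensus at the cost of more communication per iteration, which is the trade-off the paper optimizes.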


Related research

- An Optimal Stochastic Algorithm for Decentralized Nonconvex Finite-sum Optimization (10/25/2022)
- Communication-Efficient Distributed Optimization in Networks with Gradient Tracking (09/12/2019)
- On the Performance of Gradient Tracking with Local Updates (10/10/2022)
- A Simple and Efficient Stochastic Algorithm for Decentralized Nonconvex-Strongly-Concave Minimax Optimization (12/05/2022)
- Walkman: A Communication-Efficient Random-Walk Algorithm for Decentralized Optimization (04/18/2018)
- DASHA: Distributed Nonconvex Optimization with Communication Compression, Optimal Oracle Complexity, and No Client Synchronization (02/02/2022)
- A Penalty Based Method for Communication-Efficient Decentralized Bilevel Programming (11/08/2022)
