Distributed Optimization for Over-Parameterized Learning

06/14/2019
by Chi Zhang, et al.

Distributed optimization often consists of two updating phases: local optimization and inter-node communication. Conventional approaches require working nodes to communicate with the server every iteration or every few iterations to guarantee convergence. In this paper, we establish a completely different conclusion: each node can perform an arbitrary number of local optimization steps before communication. Moreover, we show that more local updating can reduce the overall communication cost, even for an infinite number of steps, where each node is free to update its local model to near-optimality before exchanging information. The extra assumption we make is that the optimal sets of the local loss functions have a non-empty intersection, which is inspired by the over-parameterization phenomenon in large-scale optimization and deep learning. Our theoretical findings are confirmed by both distributed convex optimization and deep learning experiments.
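To make the scheme concrete, below is a minimal NumPy sketch of local-update-then-average training under the intersection assumption the abstract states. It is an illustration, not the authors' algorithm: the worker count, step size, and step counts are made-up values, and each worker's data is generated from the same ground-truth model so that the local optimal sets provably intersect.

```python
import numpy as np

# Toy over-parameterized setup: K workers, each holding fewer samples than
# model dimensions, all generated by the same w_star. Every worker's local
# loss therefore has many minimizers, and w_star lies in all of them
# (non-empty intersection of the local optimal sets).

rng = np.random.default_rng(0)
K, d, n_local = 4, 20, 5          # workers, model dimension, samples/worker
w_star = rng.normal(size=d)       # shared ground-truth model

data = []
for _ in range(K):
    A = rng.normal(size=(n_local, d))   # n_local < d: under-determined
    data.append((A, A @ w_star))

def local_grad(w, A, b):
    # Gradient of the local loss 0.5 * ||A w - b||^2.
    return A.T @ (A @ w - b)

w = np.zeros(d)                          # global model held by the server
lr, local_steps, rounds = 0.01, 200, 10  # hypothetical hyperparameters

for r in range(rounds):
    local_models = []
    for A, b in data:
        w_k = w.copy()
        for _ in range(local_steps):     # many local steps, no communication
            w_k -= lr * local_grad(w_k, A, b)
        local_models.append(w_k)
    w = np.mean(local_models, axis=0)    # single communication: average
    print(f"round {r}: ||w - w*|| = {np.linalg.norm(w - w_star):.3e}")
```

In this sketch each worker drives its local model to near-optimality before a single averaging round, yet the printed distance to w_star still shrinks across rounds, which is the behavior the abstract claims under the intersection assumption.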


Related research

03/14/2022
The Role of Local Steps in Local SGD
We consider the distributed stochastic optimization problem where n agen...

07/12/2012
Distributed Strongly Convex Optimization
A lot of effort has been invested into characterizing the convergence ra...

11/20/2020
On the Benefits of Multiple Gossip Steps in Communication-Constrained Decentralized Optimization
In decentralized optimization, it is common algorithmic practice to have...

05/28/2022
Efficient-Adam: Communication-Efficient Distributed Adam with Complexity Analysis
Distributed adaptive stochastic gradient methods have been widely used f...

08/13/2018
AsySPA: An Exact Asynchronous Algorithm for Convex Optimization Over Digraphs
This paper proposes a novel exact asynchronous subgradient-push algorith...

03/31/2018
Locally Convex Sparse Learning over Networks
We consider a distributed learning setup where a sparse signal is estima...

07/06/2020
Deep Partial Updating
Emerging edge intelligence applications require the server to continuous...
