Error Feedback Fixes SignSGD and other Gradient Compression Schemes
Sign-based algorithms (e.g., signSGD) have been proposed as a biased gradient compression technique to alleviate the communication bottleneck in training large neural networks across multiple workers. We show simple convex counter-examples where signSGD does not converge to the optimum. Further, even when it does converge, signSGD may generalize poorly compared with SGD. These issues arise because of the biased nature of the sign compression operator. We then show that using error feedback, i.e., incorporating the error made by the compression operator into the next step, overcomes these issues. We prove that our algorithm, EF-SGD, achieves the same rate of convergence as SGD without any additional assumptions, for arbitrary compression operators (including the sign operator), indicating that we get gradient compression for free. Our experiments thoroughly substantiate the theory and demonstrate the superiority of our algorithm.
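The error-feedback mechanism described in the abstract is simple: compress the learning-rate-scaled gradient plus the residual left over from the previous compression, apply the compressed quantity as the update, and carry the new residual forward. Below is a minimal Python sketch of that idea, using a scaled sign compressor and a toy quadratic objective; the function names, step size, and test problem are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def scaled_sign(v):
    """Biased sign compressor scaled by the mean absolute value
    (transmits one float plus d signs)."""
    return np.abs(v).mean() * np.sign(v)

def ef_sgd(grad, x0, lr=0.05, steps=2000, compressor=scaled_sign):
    """Error-feedback SGD sketch: compress (lr * gradient + residual),
    apply the compressed step, and store what the compressor dropped."""
    x = x0.astype(float).copy()
    e = np.zeros_like(x)           # accumulated compression error
    for _ in range(steps):
        p = lr * grad(x) + e       # add back the error from the previous step
        delta = compressor(p)      # lossy, biased update actually applied
        x -= delta
        e = p - delta              # remember the part that was compressed away
    return x

if __name__ == "__main__":
    # Toy quadratic 0.5 * x^T A x (illustrative objective, not from the paper)
    A = np.diag([1.0, 10.0])
    grad = lambda x: A @ x
    x_final = ef_sgd(grad, x0=np.array([5.0, -3.0]))
    print(x_final)                 # should end up close to the optimum at the origin
```

Dropping the error term (keeping e = 0 at every step) recovers plain compressed SGD, the setting in which the biased compressor can prevent convergence, as the counter-examples in the abstract indicate.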