APMSqueeze: A Communication Efficient Adam-Preconditioned Momentum SGD Algorithm

by Hanlin Tang et al.

Adam is an important optimization algorithm for training many large-scale tasks, such as BERT and ImageNet, efficiently and accurately. However, Adam is generally not compatible with information (gradient) compression technology, so communication often becomes the bottleneck when parallelizing Adam. In this paper, we propose a communication-efficient Adam-preconditioned momentum SGD algorithm, named APMSqueeze, which compresses gradients via an error-compensated method. The proposed algorithm achieves convergence similar to Adam in terms of epochs, while significantly reducing the running time per epoch. In terms of end-to-end performance (including the full-precision precondition step), APMSqueeze provides up to a 2-10× speed-up, depending on network bandwidth. We also conduct theoretical analysis of the convergence and efficiency.
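The error-compensated compression the abstract refers to can be illustrated with a generic error-feedback step: the residual left over after compressing an update is stored locally and added back before the next compression. The sketch below uses 1-bit sign compression with a per-tensor scale as the compressor; this is a minimal illustration of the general technique, not the exact APMSqueeze operator, and the function name is ours.

```python
import numpy as np

def compress_with_error_feedback(grad, error):
    """One error-compensated 1-bit compression step (a generic sketch,
    not the exact APMSqueeze operator).

    The worker adds the residual `error` left over from the previous
    step, compresses the corrected tensor to its sign times the mean
    magnitude, and stores the new residual for the next step."""
    corrected = grad + error            # re-inject previous compression error
    scale = np.mean(np.abs(corrected))  # one scalar per tensor
    compressed = scale * np.sign(corrected)
    new_error = corrected - compressed  # residual carried to the next step
    return compressed, new_error

# Toy usage: because residuals are re-injected, the running sum of the
# compressed updates tracks the running sum of the true gradients.
rng = np.random.default_rng(0)
error = np.zeros(4)
for _ in range(100):
    g = rng.normal(size=4)
    c, error = compress_with_error_feedback(g, error)
```

Each communicated tensor costs roughly one bit per entry plus one float for the scale, which is what makes schemes of this family attractive when bandwidth is the bottleneck.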



Communication-Efficient Distributed Blockwise Momentum SGD with Error-Feedback

Communication overhead is a major bottleneck hampering the scalability o...

1-bit Adam: Communication Efficient Large-Scale Training with Adam's Convergence Speed

Scalable training of large models (like BERT and GPT-3) requires careful...

Compressing gradients by exploiting temporal correlation in momentum-SGD

An increasing bottleneck in decentralized optimization is communication....

1-bit LAMB: Communication Efficient Large-Scale Large-Batch Training with LAMB's Convergence Speed

To train large models (like BERT and GPT-3) with hundreds or even thousa...

Linear Convergent Decentralized Optimization with Compression

Communication compression has been extensively adopted to speed up large...

Efficient Riemannian Optimization on the Stiefel Manifold via the Cayley Transform

Strictly enforcing orthonormality constraints on parameter matrices has ...

Periodic Stochastic Gradient Descent with Momentum for Decentralized Training

Decentralized training has been actively studied in recent years. Althou...