APMSqueeze: A Communication Efficient Adam-Preconditioned Momentum SGD Algorithm

08/26/2020 ∙ by Hanlin Tang, et al.

Adam is an important optimization algorithm for efficient and accurate training of many important tasks such as BERT and ImageNet. However, Adam is generally not compatible with gradient compression techniques, so communication usually becomes the bottleneck when parallelizing Adam. In this paper, we propose a communication-efficient Adam-preconditioned momentum SGD algorithm, named APMSqueeze, which compresses gradients via an error-compensated method. The proposed algorithm achieves convergence efficiency similar to Adam in terms of epochs, but significantly reduces the running time per epoch. In terms of end-to-end performance (including the full-precision precondition step), APMSqueeze provides up to a 2-10× speed-up, depending on network bandwidth. We also provide a theoretical analysis of convergence and efficiency.
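The abstract describes three ingredients: an Adam-style second-moment estimate computed in a full-precision precondition step and then used as a fixed preconditioner, a momentum SGD update, and error-compensated gradient compression that folds each step's compression residual back into the next step. Below is a minimal single-worker sketch of this idea in NumPy. It is an illustration under stated assumptions, not the paper's exact algorithm: the `compress` function, the `state` dictionary, and all variable names are hypothetical, and in a real distributed run the compressed tensor is what would be communicated among workers.

```python
import numpy as np

def compress(x):
    # Illustrative 1-bit-style compressor (an assumption, not the paper's
    # exact scheme): transmit sign plus one scalar (the mean magnitude),
    # so the decompressed tensor is scale * sign(x).
    scale = np.mean(np.abs(x))
    return scale * np.sign(x)

def apmsqueeze_step(w, grad, state, lr=1e-3, beta=0.9, eps=1e-8):
    # Error compensation: fold the residual left over from the previous
    # compression back into the current gradient before compressing.
    corrected = grad + state["error"]
    compressed = compress(corrected)
    state["error"] = corrected - compressed  # residual carried to next step
    # In a distributed run, `compressed` is what gets communicated
    # (e.g., all-reduced) among workers.
    state["m"] = beta * state["m"] + (1.0 - beta) * compressed  # momentum
    # Precondition the update with the frozen second-moment estimate
    # state["v"], assumed here to come from a full-precision warm-up phase.
    w = w - lr * state["m"] / (np.sqrt(state["v"]) + eps)
    return w

# Toy usage: quadratic loss f(w) = 0.5 * ||w||^2, so grad = w.
dim = 8
rng = np.random.default_rng(0)
w = rng.standard_normal(dim)
state = {"m": np.zeros(dim), "error": np.zeros(dim), "v": np.ones(dim)}
for _ in range(100):
    w = apmsqueeze_step(w, grad=w, state=state)
```

The error buffer is the key design point in error-compensated compression: without it, the bias introduced by aggressive compression can accumulate and stall convergence, whereas feeding the residual back into the next step keeps the long-run compression error bounded.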
