Adaptive Gradient Quantization for Data-Parallel SGD

10/23/2020
by Fartash Faghri, et al.

Many communication-efficient variants of SGD use gradient quantization schemes. These schemes are often heuristic and fixed over the course of training. We empirically observe that the statistics of gradients of deep models change during training. Motivated by this observation, we introduce two adaptive quantization schemes, ALQ and AMQ. In both schemes, processors update their compression schemes in parallel by efficiently computing sufficient statistics of a parametric distribution. We improve the validation accuracy by almost 2% in challenging low-cost communication setups. Our adaptive methods are also significantly more robust to the choice of hyperparameters.
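
Since the abstract only summarizes the idea, here is a minimal sketch of what an adaptive gradient quantizer can look like in data-parallel training: each worker quantizes its normalized gradient onto a small set of levels, and the levels are periodically refit to the empirical distribution of recently observed gradient magnitudes. This is an illustration of the general technique under simplifying assumptions, not the paper's ALQ/AMQ update rules; the function names (`quantize`, `refit_levels`), the quantile-based refit, and the 8-level/25-step settings are choices made for the example.

```python
import numpy as np

def quantize(grad, levels, rng):
    """Stochastically round normalized gradient magnitudes onto `levels` (sorted, in [0, 1])."""
    norm = np.linalg.norm(grad)
    if norm == 0.0:
        return np.zeros_like(grad), norm
    r = np.abs(grad) / norm                         # normalized magnitudes in [0, 1]
    idx = np.clip(np.searchsorted(levels, r, side="right") - 1, 0, len(levels) - 2)
    lo, hi = levels[idx], levels[idx + 1]
    p_up = (r - lo) / (hi - lo)                     # unbiased stochastic rounding
    q = np.where(rng.random(r.shape) < p_up, hi, lo)
    return np.sign(grad) * q, norm                  # receiver decodes as norm * sign * level

def refit_levels(observed_magnitudes, num_levels):
    """Place levels at empirical quantiles of recently observed normalized magnitudes;
    a simple stand-in for fitting a parametric distribution to gradient statistics."""
    levels = np.quantile(observed_magnitudes, np.linspace(0.0, 1.0, num_levels))
    levels[0], levels[-1] = 0.0, 1.0                # always cover the full [0, 1] range
    return np.unique(levels)                        # sorted, duplicates removed

# Toy loop: every 25 steps the quantization levels are refit to the observed statistics.
rng = np.random.default_rng(0)
levels = np.linspace(0.0, 1.0, 8)                   # start from uniform levels
history = []
for step in range(100):
    grad = rng.standard_normal(1000) * 0.1          # placeholder for a worker's local gradient
    q_grad, norm = quantize(grad, levels, rng)
    history.append(np.abs(grad) / max(norm, 1e-12))
    if (step + 1) % 25 == 0:
        levels = refit_levels(np.concatenate(history), num_levels=8)
        history.clear()
```

In an actual data-parallel setup, the quantized values and the norm are what each worker would communicate, and all workers would apply the same refit in parallel so that their codebooks stay synchronized.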


Related research

04/28/2021  NUQSGD: Provably Communication-efficient Data-parallel SGD via Nonuniform Quantization
As the size and complexity of models and datasets grow, so does the need...

09/26/2021  Quantization for Distributed Optimization
Massive amounts of data have led to the training of large-scale machine ...

07/30/2021  DQ-SGD: Dynamic Quantization in SGD for Communication-Efficient Distributed Learning
Gradient quantization is an emerging technique in reducing communication...

08/16/2019  NUQSGD: Improved Communication Efficiency for Data-parallel SGD via Nonuniform Quantization
As the size and complexity of models and datasets grow, so does the need...

02/25/2020  Optimal Gradient Quantization Condition for Communication-Efficient Distributed Training
The communication of gradients is costly for training deep neural networ...

03/07/2022  A comparative study of several ADPCM schemes with linear and nonlinear prediction
In this paper we compare several ADPCM schemes with nonlinear prediction...

02/02/2023  On Suppressing Range of Adaptive Stepsizes of Adam to Improve Generalisation Performance
A number of recent adaptive optimizers improve the generalisation perfor...
