FedDQ: Communication-Efficient Federated Learning with Descending Quantization

10/05/2021
by   Linping Qu, et al.

Federated learning (FL) is an emerging privacy-preserving distributed learning scheme. Due to the large model size and frequent model aggregation, FL suffers from a critical communication bottleneck. Many techniques have been proposed to reduce the communication volume, including model compression and quantization; among these, quantization with an increasing number of levels over training has been proposed. This paper proposes the opposite approach to adaptive quantization. First, we present the drawback of ascending-trend quantization based on the characteristics of training. Second, we formulate the quantization optimization problem, and theoretical analysis shows that quantization with a decreasing number of levels is preferable. We then propose two strategies to guide the adaptive quantization process, using the change in training loss and the range of the model update. Experimental results on three sets of benchmarks show that descending-trend quantization not only saves more communication bits but also helps FL converge faster, compared with current ascending-trend quantization.
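The core idea can be sketched in code. Below is a minimal, hypothetical illustration (not the paper's actual algorithm): an unbiased stochastic uniform quantizer for a model update, paired with a stand-in descending schedule that halves the number of quantization levels at a fixed interval. The paper instead adapts the level count from the change in training loss and the range of the model update; the function names, the `decay_every` parameter, and the fixed halving rule are assumptions made for illustration.

```python
import numpy as np

def quantize(update, num_levels, rng=None):
    """Uniform stochastic quantization of a model update to `num_levels` levels.

    Values are mapped onto an evenly spaced grid spanning [min, max] of the
    update; stochastic rounding keeps the quantizer unbiased in expectation.
    """
    rng = rng or np.random.default_rng()
    lo, hi = update.min(), update.max()
    if hi == lo:  # constant update: nothing to quantize
        return update.copy()
    step = (hi - lo) / (num_levels - 1)
    scaled = (update - lo) / step
    floor = np.floor(scaled)
    # round up with probability equal to the fractional part
    quantized = floor + (rng.random(update.shape) < (scaled - floor))
    return lo + quantized * step

def descending_levels(round_idx, initial_levels=16, min_levels=2, decay_every=10):
    """Illustrative descending schedule (an assumption, not the paper's rule):
    halve the number of levels every `decay_every` rounds, flooring at
    `min_levels`, so later rounds spend fewer bits per parameter.
    """
    halvings = round_idx // decay_every
    return max(min_levels, initial_levels >> halvings)
```

In a FedDQ-style system, each client would quantize its update with the current level count before uploading; the server (or each client, from shared statistics such as the loss change) would then lower the level count as training stabilizes, since late-stage updates shrink in magnitude and tolerate coarser grids.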

Related research

10/31/2021 - DAdaQuant: Doubly-adaptive quantization for communication-efficient Federated Learning
Federated Learning (FL) is a powerful technique for training a model on ...

02/08/2021 - Adaptive Quantization of Model Updates for Communication-Efficient Federated Learning
Communication of model updates between client nodes and the central aggr...

06/19/2020 - DEED: A General Quantization Scheme for Communication Efficiency in Bits
In distributed optimization, a popular technique to reduce communication...

07/20/2023 - Communication-Efficient Split Learning via Adaptive Feature-Wise Compression
This paper proposes a novel communication-efficient split learning (SL) ...

05/26/2022 - QUIC-FL: Quick Unbiased Compression for Federated Learning
Distributed Mean Estimation (DME) is a fundamental building block in com...

06/21/2022 - sqSGD: Locally Private and Communication Efficient Federated Learning
Federated learning (FL) is a technique that trains machine learning mode...

08/23/2022 - Joint Privacy Enhancement and Quantization in Federated Learning
Federated learning (FL) is an emerging paradigm for training machine lea...
