Optimizing the Communication-Accuracy Trade-off in Federated Learning with Rate-Distortion Theory

01/07/2022
by   Nicole Mitchell, et al.

A significant bottleneck in federated learning is the network communication cost of sending model updates from client devices to the central server. We propose a method to reduce this cost. Our method encodes quantized updates with an appropriate universal code, taking their empirical distribution into account. Because quantization introduces error, we select quantization levels by optimizing for the desired trade-off between average total bitrate and gradient distortion. We demonstrate empirically that, in spite of the non-i.i.d. nature of federated learning, the rate-distortion frontier is consistent across datasets, optimizers, clients and training rounds, and that within each setting, distortion reliably predicts model performance. This allows for a remarkably simple compression scheme that is near-optimal in many use cases and outperforms Top-K, DRIVE, 3LC and QSGD on the Stack Overflow next-word prediction benchmark.
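To make the rate-distortion framing concrete, here is a minimal sketch of the general idea the abstract describes: uniformly quantize a client update, estimate the bitrate an ideal entropy/universal code would approach from the empirical symbol distribution, measure gradient distortion as mean squared error, and sweep quantization levels to trace the trade-off. All function names and parameters below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def quantize(update, num_levels):
    """Uniformly quantize a model update to num_levels levels (hypothetical helper)."""
    lo, hi = update.min(), update.max()
    step = (hi - lo) / (num_levels - 1)
    q = np.round((update - lo) / step)
    return q.astype(np.int64), lo, step

def empirical_entropy_bits(symbols):
    """Bits per symbol under the empirical distribution, i.e. the rate an
    ideal universal code would approach for this update."""
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def rate_distortion_point(update, num_levels):
    """One point on the empirical rate-distortion curve for this update."""
    q, lo, step = quantize(update, num_levels)
    recon = lo + q * step
    rate = empirical_entropy_bits(q)             # average bits per parameter
    distortion = float(np.mean((update - recon) ** 2))  # gradient MSE
    return rate, distortion

# Sweep quantization levels to trace an empirical rate-distortion frontier
# for a synthetic stand-in for a client's model update.
rng = np.random.default_rng(0)
update = rng.normal(size=10_000).astype(np.float32)
for levels in (2, 4, 8, 16, 32):
    r, d = rate_distortion_point(update, levels)
    print(f"{levels:>3} levels: {r:5.2f} bits/param, distortion {d:.2e}")
```

Selecting the number of quantization levels then amounts to picking the point on this curve that meets the desired bitrate budget or distortion target; the paper's finding that the frontier is consistent across settings is what makes such a simple per-update sweep practical.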

Related research

02/08/2021 · Adaptive Quantization of Model Updates for Communication-Efficient Federated Learning
Communication of model updates between client nodes and the central aggr...

08/02/2021 · Communication-Efficient Federated Learning via Predictive Coding
Federated learning can enable remote workers to collaboratively train a ...

03/15/2023 · Communication-Efficient Design for Quantized Decentralized Federated Learning
Decentralized federated learning (DFL) is a variant of federated learnin...

01/23/2023 · M22: A Communication-Efficient Algorithm for Federated Learning Inspired by Rate-Distortion
In federated learning (FL), the communication constraint between the rem...

06/28/2022 · Fundamental Limits of Communication Efficiency for Model Aggregation in Distributed Learning: A Rate-Distortion Approach
One of the main focuses in distributed learning is communication efficie...

01/28/2022 · FedLite: A Scalable Approach for Federated Learning on Resource-constrained Clients
In classical federated learning, the clients contribute to the overall t...

07/15/2020 · FetchSGD: Communication-Efficient Federated Learning with Sketching
Existing approaches to federated learning suffer from a communication bo...
