UVeQFed: Universal Vector Quantization for Federated Learning

06/05/2020
by Nir Shlezinger, et al.

Traditional deep learning models are trained at a centralized server using labeled data samples collected from end devices or users. Such data samples often contain private information that users may not be willing to share. Federated learning (FL) is an emerging approach for training such models without requiring users to share their possibly private labeled data. In FL, each user trains its copy of the learning model locally, and the server then collects the individual updates and aggregates them into a global model. A major challenge in this approach is the need for each user to efficiently transmit its learned model over the throughput-limited uplink channel. In this work, we tackle this challenge using tools from quantization theory. In particular, we identify the unique characteristics of conveying trained models over rate-constrained channels and propose a suitable quantization scheme for such settings, referred to as universal vector quantization for FL (UVeQFed). We show that combining universal vector quantization methods with FL yields a decentralized training system in which the compression of the trained models induces only minimal distortion. We then theoretically analyze this distortion, showing that it vanishes as the number of users grows. We also characterize the convergence of models trained with the traditional federated averaging method combined with UVeQFed to the model that minimizes the loss function. Our numerical results demonstrate the gains of UVeQFed over previously proposed methods in terms of both the distortion induced by quantization and the accuracy of the resulting aggregated model.
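
To make the scheme concrete, the following minimal Python sketch illustrates one round of federated averaging in which each user compresses its model update with subtractive dithered quantization before uploading it. The names and parameters (`dithered_quantize`, `federated_round`, `step`) are illustrative assumptions, and the scalar dithered quantizer is a simplified stand-in for the lattice-based universal vector quantizer used in UVeQFed; this is not the paper's implementation.

```python
import numpy as np


def dithered_quantize(update, step, rng):
    """Subtractive dithered uniform (scalar) quantizer.

    Hypothetical stand-in for UVeQFed's lattice-based universal vector
    quantizer: the user adds a dither that the server can reproduce,
    rounds to the quantization grid, and the server later subtracts
    the same dither.
    """
    dither = rng.uniform(-step / 2.0, step / 2.0, size=update.shape)
    quantized = step * np.round((update + dither) / step)
    return quantized, dither


def server_decode(quantized, dither):
    # Server side: subtract the shared dither to obtain the estimate.
    return quantized - dither


def federated_round(global_model, local_updates, step=0.05, seed=0):
    """One round of federated averaging with quantized uploads.

    In practice the dither would be generated from a seed shared by user
    and server; here it is simply passed along for brevity.
    """
    rng = np.random.default_rng(seed)
    decoded = [
        server_decode(*dithered_quantize(u, step, rng)) for u in local_updates
    ]
    # Averaging over many users suppresses the zero-mean quantization
    # error, which is why the distortion vanishes as the number of users grows.
    return global_model + np.mean(decoded, axis=0)


# Toy usage: 10 users, each holding a noisy version of the true update.
rng = np.random.default_rng(1)
true_update = rng.normal(size=100)
local_updates = [true_update + 0.1 * rng.normal(size=100) for _ in range(10)]
new_model = federated_round(np.zeros(100), local_updates)
```

Because the dither is subtracted at the server, the quantization error is zero-mean and independent across users, so averaging the decoded updates over more users drives the aggregate distortion toward zero, which is the intuition behind the paper's distortion analysis.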

