Federated Learning With Quantized Global Model Updates

06/18/2020
by Mohammad Mohammadi Amiri, et al.

We study federated learning (FL), in which mobile devices collaboratively train a global model on their local datasets with the help of a central server, while keeping the data localized. At each iteration, the server broadcasts the current global model to the devices for local training, and aggregates the local model updates from the devices to update the global model. Previous work on the communication efficiency of FL has mainly focused on the aggregation of model updates from the devices, assuming that the global model is broadcast perfectly. In this paper, we instead consider broadcasting a compressed version of the global model. This further reduces the communication cost of FL, which can be a significant bottleneck when the global model is transmitted over a bandwidth-limited wireless medium. We introduce a lossy FL (LFL) algorithm, in which both the global model and the local model updates are quantized before transmission. We analyze the convergence behavior of the proposed LFL algorithm assuming the availability of accurate local model updates at the server. Numerical experiments show that quantizing the global model can actually improve performance for non-iid data distributions; this observation is corroborated by our analytical convergence results.
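To make the round structure concrete, the sketch below shows one communication round in which both the broadcast global model and the local updates pass through a quantizer. It is a minimal illustration under simplifying assumptions: a plain stochastic uniform quantizer, numpy arrays as models, and simple averaging at the server. The function names (stochastic_quantize, lfl_round) and the aggregation rule are illustrative choices, not the paper's exact quantization scheme or reconstruction rule.

```python
# Minimal sketch of one lossy-FL (LFL) round with a quantized downlink and uplink.
# Assumptions (not from the paper): stochastic uniform quantizer, numpy models,
# simple averaging at the server; all names are illustrative only.
import numpy as np

def stochastic_quantize(x: np.ndarray, num_bits: int = 4) -> np.ndarray:
    """Uniformly quantize x to 2**num_bits levels with stochastic (unbiased) rounding."""
    lo, hi = x.min(), x.max()
    if hi == lo:                       # constant vector: nothing to quantize
        return x.copy()
    levels = 2 ** num_bits - 1
    scaled = (x - lo) / (hi - lo) * levels
    floor = np.floor(scaled)
    prob_up = scaled - floor           # round up with probability equal to the fractional part
    q = floor + (np.random.rand(*x.shape) < prob_up)
    return lo + q / levels * (hi - lo)

def lfl_round(global_model, local_train_fns, num_bits=4):
    """One round: broadcast a quantized global model, collect quantized local
    model updates, and average them at the server (one simple reconstruction choice)."""
    broadcast = stochastic_quantize(global_model, num_bits)       # lossy downlink
    updates = []
    for train in local_train_fns:
        local_model = train(broadcast)                            # device trains on the quantized model
        updates.append(stochastic_quantize(local_model - broadcast, num_bits))  # lossy uplink
    return global_model + np.mean(updates, axis=0)                # server aggregation

# Toy usage: two devices, each taking one gradient-like step toward a different target.
if __name__ == "__main__":
    model = np.zeros(10)
    devices = [lambda w, t=t: w - 0.1 * (w - t) for t in (np.ones(10), -np.ones(10))]
    for _ in range(5):
        model = lfl_round(model, devices)
```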
