Convergence of Federated Learning over a Noisy Downlink

08/25/2020
by Mohammad Mohammadi Amiri, et al.

We study federated learning (FL), where power-limited wireless devices utilize their local datasets to collaboratively train a global model with the help of a remote parameter server (PS). The PS has access to the global model and shares it with the devices for local training, and the devices return the result of their local updates to the PS to update the global model. This framework requires downlink transmission from the PS to the devices and uplink transmission from the devices to the PS. The goal of this study is to investigate the impact of the bandwidth-limited shared wireless medium in both the downlink and uplink on the performance of FL with a focus on the downlink. To this end, the downlink and uplink channels are modeled as fading broadcast and multiple access channels, respectively, both with limited bandwidth. For downlink transmission, we first introduce a digital approach, where a quantization technique is employed at the PS to broadcast the global model update at a common rate such that all the devices can decode it. Next, we propose analog downlink transmission, where the global model is broadcast by the PS in an uncoded manner. We consider analog transmission over the uplink in both cases. We further analyze the convergence behavior of the proposed analog approach assuming that the uplink transmission is error-free. Numerical experiments show that the analog downlink approach provides significant improvement over the digital one, despite a significantly lower transmit power at the PS. The experimental results corroborate the convergence results, and show that a smaller number of local iterations should be used when the data distribution is more biased, and also when the devices have a better estimate of the global model in the analog downlink approach.
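To make the setup concrete, the sketch below simulates one flavor of the scheme described above: an analog (uncoded) downlink in which each device receives a noisy copy of the global model, a few local SGD steps at each device, and analog over-the-air aggregation of the model updates on the uplink. It is a minimal illustration, not the paper's implementation: the linear-regression task, the AWGN-only channel model (no fading or power control), and all names and hyperparameters such as NUM_DEVICES, LOCAL_STEPS, and the SNR values are assumptions made for this example.

```python
# Illustrative sketch (assumptions, not the paper's implementation): federated learning
# with an analog (uncoded) noisy downlink and analog over-the-air uplink aggregation,
# using a simple linear-regression model and numpy.
import numpy as np

rng = np.random.default_rng(0)

NUM_DEVICES = 10         # assumed number of wireless devices
DIM = 20                 # model dimension
LOCAL_STEPS = 3          # local SGD iterations per global round
ROUNDS = 50
LR = 0.05
DOWNLINK_SNR_DB = 10.0   # noise level of the broadcast (PS -> devices) channel
UPLINK_SNR_DB = 10.0     # noise level of the multiple-access (devices -> PS) channel

# Synthetic local datasets: each device holds samples of y = X @ w_true + noise.
w_true = rng.normal(size=DIM)
datasets = []
for _ in range(NUM_DEVICES):
    X = rng.normal(size=(100, DIM))
    y = X @ w_true + 0.1 * rng.normal(size=100)
    datasets.append((X, y))

def add_awgn(signal, snr_db):
    """Corrupt the signal with AWGN at the given SNR (a crude channel model)."""
    power = np.mean(signal ** 2) + 1e-12
    noise_var = power / (10 ** (snr_db / 10))
    return signal + rng.normal(scale=np.sqrt(noise_var), size=signal.shape)

def local_sgd(w0, X, y, steps, lr):
    """Run a few SGD steps on the local least-squares loss; return the model update."""
    w = w0.copy()
    for _ in range(steps):
        idx = rng.choice(len(y), size=10, replace=False)
        grad = X[idx].T @ (X[idx] @ w - y[idx]) / len(idx)
        w -= lr * grad
    return w - w0   # local model update returned to the PS

w_global = np.zeros(DIM)
for t in range(ROUNDS):
    # Analog downlink: the PS broadcasts the global model uncoded; each device
    # observes its own noisy copy (fading is omitted for brevity).
    received = [add_awgn(w_global, DOWNLINK_SNR_DB) for _ in range(NUM_DEVICES)]

    # Devices train locally starting from their noisy estimate of the global model.
    updates = [local_sgd(w_est, X, y, LOCAL_STEPS, LR)
               for w_est, (X, y) in zip(received, datasets)]

    # Analog uplink (over-the-air aggregation): the updates superpose on the
    # multiple-access channel and the PS observes their noisy sum.
    superposed = add_awgn(np.sum(updates, axis=0), UPLINK_SNR_DB)
    w_global += superposed / NUM_DEVICES

    if t % 10 == 0:
        loss = np.mean([np.mean((X @ w_global - y) ** 2) for X, y in datasets])
        print(f"round {t:3d}  global loss {loss:.4f}")
```

In the digital alternative described above, the uncoded broadcast would instead quantize the global model update and encode it at a common rate that the weakest device can decode, while the uplink aggregation would remain analog in both cases.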

