Federated Learning over Noisy Channels: Convergence Analysis and Design Examples

by Xizixiang Wei, et al.

Does Federated Learning (FL) work when both uplink and downlink communications have errors? How much communication noise can FL handle, and what is its impact on learning performance? This work is devoted to answering these practically important questions by explicitly incorporating both uplink and downlink noisy channels in the FL pipeline. We present several novel convergence analyses of FL over simultaneous uplink and downlink noisy communication channels, which encompass full and partial client participation, direct model and model differential transmissions, and non-independent and identically distributed (non-IID) local datasets. These analyses characterize the sufficient conditions for FL over noisy channels to have the same convergence behavior as the ideal case of no communication error. More specifically, in order to maintain the O(1/T) convergence rate of FedAvg with perfect communications, the uplink and downlink signal-to-noise ratio (SNR) for direct model transmissions should be controlled such that it scales as O(t^2), where t is the index of communication rounds, but can stay constant for model differential transmissions. The key insight of these theoretical results is a "flying under the radar" principle - stochastic gradient descent (SGD) is an inherently noisy process, and uplink/downlink communication noise can be tolerated as long as it does not dominate the time-varying SGD noise. We exemplify these theoretical findings with two widely adopted communication techniques - transmit power control and diversity combining - and further validate their performance advantages over the standard methods via extensive numerical experiments on several real-world FL tasks.
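To make the abstract's sufficient condition concrete, the following is a minimal, illustrative sketch (not the paper's exact algorithm or experimental setup): a toy FedAvg run on synthetic linear regression where both the downlink broadcast and the uplink model uploads are corrupted by additive white Gaussian noise, and the SNR for direct model transmission is scaled as O(t^2) across rounds. All names, data sizes, and hyperparameters here are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear-regression data split across clients (illustrative only).
num_clients, dim, n_per = 5, 10, 50
w_true = rng.normal(size=dim)
clients = []
for _ in range(num_clients):
    X = rng.normal(size=(n_per, dim))
    y = X @ w_true + 0.1 * rng.normal(size=n_per)
    clients.append((X, y))

def noisy_channel(v, snr):
    """Additive white Gaussian noise at a given SNR (signal power / noise power)."""
    signal_power = np.mean(v ** 2) + 1e-12
    return v + rng.normal(scale=np.sqrt(signal_power / snr), size=v.shape)

def fedavg_noisy(T=200, lr=0.05, snr0=10.0, scale_snr=True):
    """FedAvg with noisy uplink/downlink direct model transmission.

    With scale_snr=True the per-round SNR grows as snr0 * t^2, matching the
    O(t^2) sufficient condition quoted in the abstract; with scale_snr=False
    the SNR stays constant.
    """
    w = np.zeros(dim)
    for t in range(1, T + 1):
        snr = snr0 * t ** 2 if scale_snr else snr0
        uploads = []
        for X, y in clients:
            w_local = noisy_channel(w, snr)          # noisy downlink broadcast
            grad = X.T @ (X @ w_local - y) / len(y)  # one local gradient step
            w_local = w_local - lr * grad
            uploads.append(noisy_channel(w_local, snr))  # noisy uplink
        w = np.mean(uploads, axis=0)                 # server averaging
    return w

w_final = fedavg_noisy(scale_snr=True)
err = float(np.linalg.norm(w_final - w_true))
print(f"distance to ground truth with O(t^2) SNR scaling: {err:.4f}")
```

Because the channel noise shrinks as the rounds progress, it stays below the SGD noise floor and the iterates converge close to the ground-truth model, which is the "flying under the radar" behavior in miniature; setting `scale_snr=False` with a low `snr0` instead leaves a persistent error floor.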

