Federated Learning over Noisy Channels: Convergence Analysis and Design Examples

01/06/2021
by Xizixiang Wei, et al.

Does Federated Learning (FL) work when both uplink and downlink communications have errors? How much communication noise can FL handle, and what is its impact on the learning performance? This work is devoted to answering these practically important questions by explicitly incorporating both uplink and downlink noisy channels in the FL pipeline. We present several novel convergence analyses of FL over simultaneous uplink and downlink noisy communication channels, which encompass full and partial client participation, direct model and model differential transmissions, and non-independent and identically distributed (non-IID) local datasets. These analyses characterize sufficient conditions for FL over noisy channels to have the same convergence behavior as the ideal case of no communication error. More specifically, in order to maintain the O(1/T) convergence rate of FedAvg with perfect communications, the uplink and downlink signal-to-noise ratios (SNRs) for direct model transmissions should be controlled such that they scale as O(t^2), where t is the index of communication rounds, but can stay constant for model differential transmissions. The key insight of these theoretical results is a "flying under the radar" principle - stochastic gradient descent (SGD) is an inherently noisy process, and uplink/downlink communication noise can be tolerated as long as it does not dominate the time-varying SGD noise. We exemplify these theoretical findings with two widely adopted communication techniques - transmit power control and diversity combining - and further validate their performance advantages over the standard methods via extensive numerical experiments on several real-world FL tasks.
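The O(t^2) SNR condition can be illustrated with a small toy simulation (a sketch, not the paper's exact setup or power-control scheme): FedAvg with full participation on per-client least-squares objectives, where each uplink and downlink model transmission is corrupted by additive Gaussian noise whose variance decays as 1/t^2, i.e. the SNR grows as O(t^2). All problem sizes and names (`A`, `b`, `run`) are illustrative choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_clients, rounds, local_steps, lr = 5, 4, 200, 2, 0.05

# Each client k holds a private quadratic objective 0.5*||A_k x - b_k||^2.
A = [rng.normal(size=(20, d)) for _ in range(n_clients)]
b = [rng.normal(size=20) for _ in range(n_clients)]

def run(noisy):
    """FedAvg with optional additive channel noise on every transmission."""
    x = np.zeros(d)
    for t in range(1, rounds + 1):
        updates = []
        for k in range(n_clients):
            xk = x.copy()
            for _ in range(local_steps):  # local SGD (here: full gradients)
                grad = A[k].T @ (A[k] @ xk - b[k]) / len(b[k])
                xk -= lr * grad
            if noisy:
                # Uplink noise with std 1/t, i.e. variance O(1/t^2), SNR O(t^2).
                xk = xk + rng.normal(scale=1.0 / t, size=d)
            updates.append(xk)
        x = np.mean(updates, axis=0)  # server aggregation
        if noisy:
            # Downlink broadcast noise with the same O(1/t^2) variance scaling.
            x = x + rng.normal(scale=1.0 / t, size=d)
    return sum(0.5 * np.mean((A[k] @ x - b[k]) ** 2)
               for k in range(n_clients)) / n_clients

clean, noisy = run(False), run(True)
print(f"noiseless loss {clean:.4f}  noisy loss {noisy:.4f}")
```

With this schedule the two runs end at nearly the same loss: the early large channel noise is forgotten by the contraction of the SGD updates, and the late-round noise is already below the scale of the optimization dynamics, which is the "flying under the radar" intuition. Replacing `scale=1.0 / t` with a constant makes the gap reappear, mirroring the abstract's claim that constant SNR suffices only for model differential transmissions.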


12/07/2020

Design and Analysis of Uplink and Downlink Communications for Federated Learning

Communication has been known to be one of the primary bottlenecks of fed...
06/17/2020

Mix2FLD: Downlink Federated Learning After Uplink Federated Distillation With Two-Way Mixup

This letter proposes a novel communication-efficient and privacy-preserv...
08/25/2020

Convergence of Federated Learning over a Noisy Downlink

We study federated learning (FL), where power-limited wireless devices u...
06/06/2022

Interference Management for Over-the-Air Federated Learning in Multi-Cell Wireless Networks

Federated learning (FL) over resource-constrained wireless networks has ...
09/25/2022

On the Stability Analysis of Open Federated Learning Systems

We consider the open federated learning (FL) systems, where clients may ...
01/29/2021

Federated Learning over Wireless Device-to-Device Networks: Algorithms and Convergence Analysis

The proliferation of Internet-of-Things (IoT) devices and cloud-computin...
02/25/2020

Stochastic-Sign SGD for Federated Learning with Theoretical Guarantees

Federated learning (FL) has emerged as a prominent distributed learning ...