Federated Learning over Noisy Channels: Convergence Analysis and Design Examples

01/06/2021
by Xizixiang Wei, et al.

Does Federated Learning (FL) work when both uplink and downlink communications have errors? How much communication noise can FL handle, and what is its impact on learning performance? This work is devoted to answering these practically important questions by explicitly incorporating both uplink and downlink noisy channels in the FL pipeline. We present several novel convergence analyses of FL over simultaneous uplink and downlink noisy communication channels, which encompass full and partial client participation, direct model and model differential transmissions, and non-independent and identically distributed (IID) local datasets. These analyses characterize sufficient conditions for FL over noisy channels to have the same convergence behavior as the ideal case of no communication error. More specifically, in order to maintain the O(1/T) convergence rate of FedAvg with perfect communications, the uplink and downlink signal-to-noise ratios (SNRs) for direct model transmissions should be controlled such that they scale as O(t^2), where t is the index of communication rounds, but they can stay constant for model differential transmissions. The key insight of these theoretical results is a "flying under the radar" principle - stochastic gradient descent (SGD) is an inherently noisy process, and uplink/downlink communication noise can be tolerated as long as it does not dominate the time-varying SGD noise. We exemplify these theoretical findings with two widely adopted communication techniques - transmit power control and diversity combining - and further validate their performance advantages over standard methods via extensive numerical experiments using several real-world FL tasks.
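To make the SNR-scaling condition concrete, here is a minimal sketch (not the authors' code; the quadratic objective, client count, and step-size schedule are illustrative assumptions) of FedAvg with additive Gaussian noise on both the downlink broadcast and the uplink transmissions, where the noise power is scaled down as 1/t^2 so that the SNR grows as O(t^2), matching the paper's sufficient condition for direct model transmission:

```python
import numpy as np

# Hypothetical sketch of FedAvg over noisy up/downlink channels.
# Each client k minimizes ||w - c_k||^2; the global optimum is mean(c_k).
# Channel noise power decays as 1/t^2, i.e. SNR scales as O(t^2).

rng = np.random.default_rng(0)
NUM_CLIENTS, DIM, ROUNDS = 10, 5, 200
targets = rng.normal(size=(NUM_CLIENTS, DIM))
w_opt = targets.mean(axis=0)

def channel(x, t, base_noise=1.0):
    """Additive Gaussian channel whose noise std shrinks as 1/t^2."""
    sigma = base_noise / (t ** 2)
    return x + rng.normal(scale=sigma, size=x.shape)

w = np.zeros(DIM)
for t in range(1, ROUNDS + 1):
    lr = 1.0 / t  # diminishing step size, standard in FedAvg analyses
    updates = []
    for k in range(NUM_CLIENTS):
        w_k = channel(w, t)                 # noisy downlink broadcast
        grad = 2.0 * (w_k - targets[k])     # local gradient of ||w - c_k||^2
        w_k = w_k - lr * grad               # one local SGD step
        updates.append(channel(w_k, t))     # noisy uplink transmission
    w = np.mean(updates, axis=0)            # server-side averaging

err = float(np.linalg.norm(w - w_opt))
print(f"final distance to optimum: {err:.4f}")
```

With the O(t^2) SNR schedule the channel noise decays faster than the O(1/t) step size, so it never dominates the SGD dynamics and the iterates still converge toward the optimum; replacing the 1/t^2 decay with a constant noise level illustrates the "flying under the radar" boundary the paper characterizes.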


Related research

07/14/2023 - Improved Convergence Analysis and SNR Control Strategies for Federated Learning in the Presence of Noise
We propose an improved convergence analysis technique that characterizes...

12/07/2020 - Design and Analysis of Uplink and Downlink Communications for Federated Learning
Communication has been known to be one of the primary bottlenecks of fed...

06/17/2020 - Mix2FLD: Downlink Federated Learning After Uplink Federated Distillation With Two-Way Mixup
This letter proposes a novel communication-efficient and privacy-preserv...

09/19/2023 - RIS-Assisted Over-the-Air Adaptive Federated Learning with Noisy Downlink
Over-the-air federated learning (OTA-FL) exploits the inherent superposi...

08/25/2020 - Convergence of Federated Learning over a Noisy Downlink
We study federated learning (FL), where power-limited wireless devices u...

08/16/2023 - Stochastic Controlled Averaging for Federated Learning with Communication Compression
Communication compression, a technique aiming to reduce the information ...

06/11/2023 - Analysis of a contention-based approach over 5G NR for Federated Learning in an Industrial Internet of Things scenario
The growing interest in new applications involving co-located heterogene...
