Improved Convergence Analysis and SNR Control Strategies for Federated Learning in the Presence of Noise

07/14/2023
by Antesh Upadhyay et al.

We propose an improved convergence analysis technique that characterizes the distributed learning paradigm of federated learning (FL) with imperfect/noisy uplink and downlink communications. Such imperfect communication scenarios arise in practical deployments of FL over emerging communication systems and protocols. The analysis developed in this paper demonstrates, for the first time, that there is an asymmetry in the detrimental effects of uplink and downlink communication noise in FL: the adverse effect of downlink noise on the convergence of FL algorithms is more severe. Using this insight, we propose improved signal-to-noise ratio (SNR) control strategies that, after discarding negligible higher-order terms, yield a convergence rate for FL similar to that of a perfect, noise-free communication channel, while requiring significantly fewer power resources than existing solutions. In particular, we establish that to maintain the O(1/√(K)) rate of convergence of noise-free FL, the uplink and downlink noise must be scaled down by Ω(√(k)) and Ω(k), respectively, where k denotes the communication round, k=1,…,K. Our theoretical result offers two further benefits: first, it does not rely on the somewhat unrealistic bounded-client-dissimilarity assumption, and second, it only requires smooth non-convex loss functions, a function class better suited to modern machine learning and deep learning models. We also perform extensive empirical analysis to verify the validity of our theoretical findings.
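The asymmetric SNR-control schedule from the abstract can be illustrated with a toy simulation. The sketch below is not the paper's algorithm or experimental setup; it is a minimal, hypothetical FedAvg-style loop on simple quadratic client losses, where the downlink noise power is scaled down by a factor of k (std divided by √k) and the uplink noise power by a factor of √k (std divided by k^0.25), matching the Ω(k) and Ω(√k) scalings stated above. All names, losses, and constants are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting (illustrative only): client i minimizes ||x - c_i||^2,
# so the global optimum is the mean of the targets c_i.
num_clients, dim, K, lr = 5, 10, 200, 0.1
targets = rng.normal(size=(num_clients, dim))
optimum = targets.mean(axis=0)

sigma_up, sigma_down = 0.5, 0.5  # base noise standard deviations (assumed)
x = np.zeros(dim)                # global model

for k in range(1, K + 1):
    # Downlink: server broadcasts the model to each client; noise power is
    # scaled down by Omega(k), i.e. variance / k, so std / sqrt(k).
    received = [x + rng.normal(scale=sigma_down / np.sqrt(k), size=dim)
                for _ in range(num_clients)]
    # One local gradient step on each client's quadratic loss (grad = 2(x - c)).
    updates = [xi - lr * 2.0 * (xi - c) for xi, c in zip(received, targets)]
    # Uplink: clients send updates back; noise power is scaled down by
    # Omega(sqrt(k)), i.e. variance / sqrt(k), so std / k**0.25.
    noisy = [u + rng.normal(scale=sigma_up / k**0.25, size=dim)
             for u in updates]
    x = np.mean(noisy, axis=0)  # server aggregation

err = float(np.linalg.norm(x - optimum))
print(f"final distance to optimum: {err:.3f}")
```

With this schedule the injected noise shrinks across rounds, so the iterate settles near the optimum despite noisy links; holding the noise constant instead would leave a persistent error floor.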


