Over-the-Air Computation Aided Federated Learning with the Aggregation of Normalized Gradient

08/17/2023
by Rongfei Fan, et al.

Over-the-air computation is a communication-efficient solution for federated learning (FL). In such a system, an iterative procedure is performed: every mobile device computes the local gradient of its private loss function, amplifies it, and transmits it; the server receives the aggregated gradient all at once, generates updated model parameters, and broadcasts them to every mobile device. When selecting the amplification factor, most related works assume the local gradient always attains its maximal norm, although this norm actually fluctuates over iterations, which may degrade convergence performance. To circumvent this problem, we propose normalizing each local gradient before amplifying it. For a smooth loss function, we prove that the proposed method converges to a stationary point at a sub-linear rate. For a smooth and strongly convex loss function, we prove that it achieves the minimal training loss at a linear rate, up to an arbitrarily small positive tolerance; moreover, a tradeoff between the convergence rate and this tolerance is discovered. To speed up convergence, we also formulate optimization problems over the system parameters for both cases. Although these problems are non-convex, we derive their optimal solutions with polynomial complexity. Experimental results show that the proposed method outperforms benchmark methods in convergence performance.
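To illustrate the aggregation idea described above, the following is a minimal sketch of one way over-the-air FL with normalized local gradients could be simulated. The quadratic local losses, the parameter names (num_devices, noise_std, amplification, lr), and the additive-noise channel model are illustrative assumptions for this sketch, not the paper's exact system model or amplification-factor design.

```python
import numpy as np

rng = np.random.default_rng(0)

num_devices = 10      # number of mobile devices (assumed value)
dim = 5               # model dimension (assumed value)
noise_std = 0.01      # receiver noise standard deviation (assumed channel model)
amplification = 1.0   # common amplification factor; normalization makes every
                      # transmitted signal unit-norm, so a fixed factor suffices
                      # instead of one sized for the maximal gradient norm
lr = 0.1              # server step size
num_rounds = 200

# Each device k holds a private quadratic loss f_k(w) = 0.5 * ||w - c_k||^2.
centers = rng.normal(size=(num_devices, dim))
w = np.zeros(dim)     # global model kept at the server


def local_gradient(w, c):
    """Gradient of the quadratic local loss 0.5 * ||w - c||^2."""
    return w - c


for t in range(num_rounds):
    # Every device normalizes its local gradient before amplification,
    # so its transmit power is known in advance.
    signals = []
    for k in range(num_devices):
        g = local_gradient(w, centers[k])
        g_normalized = g / (np.linalg.norm(g) + 1e-12)
        signals.append(amplification * g_normalized)

    # The multiple-access channel delivers the sum of all transmitted
    # signals "all at once", corrupted by receiver noise.
    aggregated = np.sum(signals, axis=0) + noise_std * rng.normal(size=dim)

    # Server update using the aggregated (normalized) gradient direction,
    # followed by broadcast of w to the devices (implicit here).
    w = w - lr * aggregated / num_devices

# The minimizer of the sum of the local losses is the mean of the centers.
print("distance to optimum:", np.linalg.norm(w - centers.mean(axis=0)))
```

In this toy setting the update converges to a neighborhood of the optimum whose size is governed by the noise level and step size, mirroring the rate-versus-tolerance tradeoff discussed in the abstract.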


