Accelerated Gradient Descent Learning over Multiple Access Fading Channels

07/26/2021
by Raz Paul, et al.

We consider a distributed learning problem in a wireless network consisting of N distributed edge devices and a parameter server (PS). The objective function is a sum of the edge devices' local loss functions, and the devices aim to train a shared model by communicating with the PS over a multiple access channel (MAC). This problem has attracted growing interest in distributed sensing systems and, more recently, in federated learning, where it is known as over-the-air computation. In this paper, we develop a novel Accelerated Gradient-descent Multiple Access (AGMA) algorithm that uses momentum-based gradient signals over a noisy fading MAC to improve the convergence rate compared to existing methods. Furthermore, AGMA does not require power control or beamforming to cancel the fading effect, which reduces implementation complexity. We analyze AGMA theoretically and establish finite-sample error bounds for both convex and strongly convex loss functions with Lipschitz gradient. For the strongly convex case, we show that AGMA approaches the best-known linear convergence rate as the network size increases. For the convex case, we show that AGMA significantly improves the sub-linear convergence rate compared to existing methods. Finally, we present simulation results using real datasets that demonstrate the superior performance of AGMA.
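
To make the over-the-air accelerated update concrete, below is a minimal sketch (not the paper's exact construction) of momentum-based gradient aggregation over a noisy fading MAC: each device computes its gradient at a Nesterov-style look-ahead point and transmits it as an analog signal, the signals superpose over the channel with per-device fading and additive receiver noise, and the PS applies an accelerated update to the rescaled sum without per-device power control or beamforming. All numerical values, the quadratic local losses, and the fading/noise model are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N, d = 20, 5            # number of edge devices, model dimension (assumed values)
eta, gamma = 0.02, 0.9  # step size and momentum coefficient (assumed values)
sigma_noise = 0.01      # std of additive channel noise at the PS (assumed)

# Each device n holds a local quadratic loss f_n(theta) = 0.5 * ||A_n @ theta - b_n||^2.
A = rng.standard_normal((N, d, d))
b = rng.standard_normal((N, d))

def local_gradient(n, theta):
    """Gradient of device n's local quadratic loss at theta."""
    return A[n].T @ (A[n] @ theta - b[n])

theta = np.zeros(d)     # shared model held at the parameter server (PS)
momentum = np.zeros(d)

for t in range(300):
    # Devices evaluate gradients at a Nesterov-style look-ahead point and
    # transmit them as analog signals over the MAC. The channel superposes the
    # signals with per-device fading gains h_n and adds receiver noise; no
    # power control or beamforming is applied at the devices.
    lookahead = theta - gamma * momentum
    h = np.abs(rng.standard_normal(N))   # assumed nonnegative fading gains
    received = sum(h[n] * local_gradient(n, lookahead) for n in range(N))
    received = received + sigma_noise * rng.standard_normal(d)

    # The PS rescales the received superposition into an estimate of the
    # average gradient and applies the accelerated (momentum) update.
    agg_grad = received / (N * h.mean())
    momentum = gamma * momentum + eta * agg_grad
    theta = theta - momentum

print("final average loss:",
      np.mean([0.5 * np.sum((A[n] @ theta - b[n]) ** 2) for n in range(N)]))
```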


Related research:

- 08/20/2019: On Analog Gradient Descent Learning over Multiple Access Fading Channels
- 10/30/2020: Fast Convergence Algorithm for Analog Federated Learning
- 12/10/2020: DONE: Distributed Newton-type Method for Federated Edge Learning
- 02/26/2020: Acceleration for Compressed Gradient Descent in Distributed and Federated Optimization
- 07/25/2021: Revisiting Analog Over-the-Air Machine Learning: The Blessing and Curse of Interference
- 09/24/2022: Communication-Efficient Federated Learning Using Censored Heavy Ball Descent
- 02/05/2022: Distributed Learning With Sparsified Gradient Differences
