Non-Coherent Over-the-Air Decentralized Stochastic Gradient Descent

11/19/2022
by   Nicolo Michelusi, et al.

This paper proposes a Decentralized Stochastic Gradient Descent (DSGD) algorithm to solve distributed machine-learning tasks over wirelessly-connected systems, without the coordination of a base station. It combines local stochastic gradient descent steps with a Non-Coherent Over-The-Air (NCOTA) consensus scheme that enables concurrent transmissions by leveraging the waveform-superposition property of the wireless channel. With NCOTA, local optimization signals are mapped to a mixture of orthogonal preamble sequences and transmitted concurrently over the wireless channel under half-duplex constraints. Receivers estimate consensus by non-coherently combining the received signals with the preamble sequences, and mitigate the impact of noise and fading via a consensus stepsize. NCOTA-DSGD operates without channel state information (typically used in over-the-air computation schemes for channel inversion) and leverages the channel pathloss to mix signals, without explicit knowledge of the mixing weights (typically assumed known in consensus-based optimization). It is shown that, with a suitable tuning of decreasing consensus and learning stepsizes, the error (measured as Euclidean distance) between the local and globally optimal models vanishes at rate 𝒪(k^{-1/4}) after k iterations. NCOTA-DSGD is evaluated numerically on an image-classification task on the MNIST dataset, cast as a regularized cross-entropy loss minimization. Numerical results show faster convergence, as a function of running time, than implementations of the classical DSGD algorithm over digital and analog orthogonal channels when the number of learning devices is large and under stringent delay constraints.
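The core update described above, a local gradient step combined with a noise-perturbed consensus step under decreasing stepsizes, can be illustrated with a minimal simulation. This is a hedged sketch only: the uniform mixing matrix `W`, the Gaussian noise model standing in for fading and receiver noise, the quadratic local losses, and the stepsize schedules `gamma` and `eta` are all illustrative assumptions, not the paper's exact NCOTA construction.

```python
import numpy as np

# Sketch of a DSGD step with noisy consensus, loosely following the
# abstract: x_i <- x_i + gamma_k * (consensus estimate - x_i) - eta_k * grad_i.
# All modeling choices below are assumptions for illustration.

rng = np.random.default_rng(0)
n_devices, dim = 8, 4

# Each device holds a local quadratic loss f_i(x) = 0.5 * ||x - b_i||^2,
# so the global optimum is the mean of the b_i.
b = rng.normal(size=(n_devices, dim))
x = np.zeros((n_devices, dim))  # local models, one row per device

# Illustrative symmetric, doubly-stochastic mixing matrix (uniform averaging);
# in the paper, mixing arises implicitly from the channel pathloss.
W = np.full((n_devices, n_devices), 1.0 / n_devices)

for k in range(1, 2001):
    gamma = 0.5 / k**0.5    # decreasing consensus stepsize (assumed schedule)
    eta = 0.5 / k**0.75     # decreasing learning stepsize (assumed schedule)
    grads = x - b           # exact gradients here; SGD would add sampling noise
    noise = 0.01 * rng.normal(size=x.shape)  # stands in for fading/receiver noise
    # Noise-perturbed consensus step followed by a local gradient step.
    x = x + gamma * (W @ x - x + noise) - eta * grads

consensus_error = np.linalg.norm(x - x.mean(axis=0))       # device disagreement
opt_error = np.linalg.norm(x.mean(axis=0) - b.mean(axis=0))  # distance to optimum
```

With these schedules both the disagreement across devices and the distance of the averaged model to the global optimum shrink as iterations proceed, mirroring the vanishing-error behavior the paper proves for its exact scheme.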

Related research

10/27/2022: Decentralized Federated Learning via Non-Coherent Over-the-Air Consensus
This paper presents NCOTA-DGD, a Decentralized Gradient Descent (DGD) al...

03/06/2020: Decentralized SGD with Over-the-Air Computation
We study the performance of decentralized stochastic gradient descent (D...

07/23/2019: Federated Learning over Wireless Fading Channels
We study federated machine learning at the wireless network edge, where ...

07/08/2019: Collaborative Machine Learning at the Wireless Edge with Blind Transmitters
We study wireless collaborative machine learning (ML), where mobile edge...

01/03/2019: Machine Learning at the Wireless Edge: Distributed Stochastic Gradient Descent Over-the-Air
We study collaborative machine learning at the wireless edge, where powe...

08/20/2019: On Analog Gradient Descent Learning over Multiple Access Fading Channels
We consider a distributed learning problem over multiple access channel ...
