Decentralized Federated Learning via SGD over Wireless D2D Networks

02/28/2020
by   Hong Xing, et al.

Federated Learning (FL), an emerging paradigm for fast intelligence acquisition at the network edge, enables joint training of a machine learning model over distributed data sets and computing resources with limited disclosure of local data. Communication is a critical enabler of large-scale FL due to the significant amount of model information exchanged among edge devices. In this paper, we consider a network of wireless devices sharing a common fading wireless channel for the deployment of FL. Each device holds a generally distinct training set, and communication typically takes place in a Device-to-Device (D2D) manner. In the ideal case in which all devices within communication range can communicate simultaneously and noiselessly, a standard protocol that is guaranteed to converge to an optimal solution of the global empirical risk minimization problem under convexity and connectivity assumptions is Decentralized Stochastic Gradient Descent (DSGD). DSGD integrates local SGD steps with periodic consensus averages that require communication between neighboring devices. In this paper, wireless protocols are proposed that implement DSGD while accounting for the presence of path loss, fading, blockages, and mutual interference. The proposed protocols are based on graph coloring for scheduling and on both digital and analog transmission strategies at the physical layer, with the latter leveraging over-the-air computing via sparsity-based recovery.
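The core DSGD update described above (each device takes a local SGD step on its own data, then averages its model with its neighbors) can be illustrated with a minimal sketch. The ring topology, doubly stochastic mixing matrix, learning rate, and least-squares objective below are illustrative assumptions, not the paper's setup, and the noiseless averaging stands in for the wireless consensus step the paper implements over D2D links:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy decentralized least-squares: each of K devices holds its own data shard.
K, d, n = 4, 3, 20
X = [rng.standard_normal((n, d)) for _ in range(K)]
w_true = rng.standard_normal(d)
y = [Xi @ w_true + 0.01 * rng.standard_normal(n) for Xi in X]

# Ring topology; W is a doubly stochastic mixing (consensus) matrix,
# so each device only averages with its two neighbors.
W = np.zeros((K, K))
for k in range(K):
    W[k, k] = 0.5
    W[k, (k - 1) % K] = 0.25
    W[k, (k + 1) % K] = 0.25

w = np.zeros((K, d))  # one model copy per device
lr = 0.05
for t in range(300):
    # Local SGD step on a random mini-batch at each device.
    grads = np.zeros_like(w)
    for k in range(K):
        idx = rng.choice(n, size=5, replace=False)
        Xb, yb = X[k][idx], y[k][idx]
        grads[k] = Xb.T @ (Xb @ w[k] - yb) / len(idx)
    # Consensus averaging with neighbors, interleaved with the SGD step.
    w = W @ (w - lr * grads)

# Devices reach (approximate) consensus on a model near w_true.
print(np.max(np.abs(w - w_true)))
```

Under the convexity and connectivity assumptions mentioned in the abstract (the ring graph here is connected, and the mixing matrix is doubly stochastic), all local models contract toward a common minimizer of the global empirical risk.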


