Decentralized Federated Learning with Unreliable Communications

08/05/2021
by Hao Ye, et al.

Decentralized federated learning, which builds on decentralized learning, enables edge devices to collaborate on model training in a peer-to-peer manner without the assistance of a server. However, existing decentralized learning frameworks usually assume perfect communication among devices, in which messages, e.g., gradients or parameters, can be exchanged reliably. Real-world communication networks, in contrast, are prone to packet loss and transmission errors, and transmission reliability comes at a price. The common solution is to adopt a reliable transport-layer protocol, e.g., the transmission control protocol (TCP), which however incurs significant communication overhead and limits the number of device-to-device links that can be supported. For a communication network that uses a lightweight but unreliable transport protocol, the user datagram protocol (UDP), we propose a robust decentralized stochastic gradient descent (SGD) approach, called Soft-DSGD, to address the unreliability issue. Soft-DSGD updates the model parameters with partially received messages and optimizes the mixing weights according to the link reliability matrix of the communication links. We prove that the proposed decentralized training system, even with unreliable communications, achieves the same asymptotic convergence rate as vanilla decentralized SGD with perfect communications. Moreover, numerical results confirm that the proposed approach leverages all available unreliable communication links to speed up convergence.
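To make the described update concrete, below is a minimal, hypothetical sketch of a Soft-DSGD-style step in Python. The function name soft_dsgd_step, the lost-packet fallback, and the per-neighbor weights are illustrative assumptions; the paper's actual optimization of the mixing weights from the link reliability matrix is not reproduced here.

```python
# Hypothetical sketch of a Soft-DSGD-style update (not the authors' code).
# Assumptions: parameters are flat numpy vectors; received[j] is None when
# the packet from neighbor j was dropped by the unreliable (UDP-like) link;
# w_i holds row i of a mixing matrix derived from link reliabilities.
import numpy as np

def soft_dsgd_step(x_i, grad_i, neighbors, received, w_i, lr=0.1):
    """One local update at node i using partially received messages.

    x_i       : local parameter vector of node i
    grad_i    : local stochastic gradient evaluated at x_i
    neighbors : list of neighbor ids
    received  : dict j -> neighbor parameter vector, or None if lost
    w_i       : dict j -> mixing weight assigned to neighbor j
    """
    # Self weight so the full mixing row sums to one.
    mixed = (1.0 - sum(w_i[j] for j in neighbors)) * x_i
    for j in neighbors:
        msg = received.get(j)
        # If the packet was lost, fall back to the local copy so the
        # mixing step remains a valid convex combination.
        mixed += w_i[j] * (msg if msg is not None else x_i)
    # Gradient step after mixing, as in standard decentralized SGD.
    return mixed - lr * grad_i

# Example: the packet from neighbor 2 was lost on the unreliable link.
x_new = soft_dsgd_step(np.zeros(3), np.ones(3), [1, 2],
                       {1: np.ones(3), 2: None}, {1: 0.3, 2: 0.3})
```

Substituting the local copy for lost messages is one simple way to keep each mixing row summing to one despite packet loss; how the weights themselves should be chosen from the link reliability matrix is the subject of the paper.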

Related research

07/26/2021 · Decentralized Federated Learning: Balancing Communication and Computing Costs
Decentralized federated learning (DFL) is a powerful framework of distri...

05/30/2021 · PPT: A Privacy-Preserving Global Model Training Protocol for Federated Learning in P2P Networks
The concept of Federated Learning has emerged as a convergence of distri...

07/09/2018 · Efficient Decentralized Deep Learning by Dynamic Model Averaging
We propose an efficient protocol for decentralized training of deep neur...

06/17/2020 · Communication-Efficient Robust Federated Learning Over Heterogeneous Datasets
This work investigates fault-resilient federated learning when the data ...

02/28/2020 · Decentralized Federated Learning via SGD over Wireless D2D Networks
Federated Learning (FL), an emerging paradigm for fast intelligent acqui...

09/18/2023 · A Multi-Token Coordinate Descent Method for Semi-Decentralized Vertical Federated Learning
Communication efficiency is a major challenge in federated learning (FL)...

06/11/2019 · Optimizing Pipelined Computation and Communication for Latency-Constrained Edge Learning
Consider a device that is connected to an edge processor via a communica...
