Connectivity-Aware Semi-Decentralized Federated Learning over Time-Varying D2D Networks

03/15/2023
by Rohit Parasnis, et al.

Semi-decentralized federated learning blends the conventional device-to-server (D2S) interaction structure of federated model training with localized device-to-device (D2D) communications. We study this architecture over practical edge networks with multiple D2D clusters modeled as time-varying and directed communication graphs. Our investigation results in an algorithm that controls the fundamental trade-off between (a) the rate at which the model training process converges towards the global optimizer and (b) the number of D2S transmissions required for global aggregation. Specifically, in our semi-decentralized methodology, D2D consensus updates are injected into the federated averaging framework based on column-stochastic weight matrices that encapsulate the connectivity within the clusters. To arrive at our algorithm, we show how the expected optimality gap of the current global model depends on the two largest singular values of the weighted adjacency matrices (and hence on the densities) of the D2D clusters. We then derive tight bounds on these singular values in terms of the node degrees of the D2D clusters, and we use the resulting expressions to design a threshold on the number of clients that must participate in any given global aggregation round to ensure a desired convergence rate. Simulations on real-world datasets show that our connectivity-aware algorithm significantly reduces the total communication cost required to reach a target accuracy compared with baselines, with the size of the savings depending on the connectivity structure and the learning task.
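The abstract's core mechanism can be illustrated with a minimal NumPy sketch: build a column-stochastic weight matrix from a cluster's directed adjacency matrix, apply D2D consensus mixing to the clients' local models, and inspect the singular values that the paper's analysis ties to the convergence rate. The function names, the toy 4-client topology, and the single-cluster setup below are illustrative assumptions for this sketch, not the authors' implementation.

```python
import numpy as np

def column_stochastic_matrix(adj):
    """Build a column-stochastic weight matrix from a directed adjacency
    matrix: each node adds a self-loop and splits its mass evenly over
    the nodes that receive from it, so every column sums to 1."""
    A = adj.astype(float) + np.eye(adj.shape[0])
    return A / A.sum(axis=0, keepdims=True)

def d2d_consensus(models, W, rounds=1):
    """Run `rounds` D2D mixing steps within a cluster: each client
    replaces its model with a W-weighted combination of the models it
    receives (models has shape n_clients x model_dim)."""
    for _ in range(rounds):
        models = W @ models
    return models

# Toy example: one D2D cluster of 4 clients with 2-dimensional models.
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]])
W = column_stochastic_matrix(adj)

rng = np.random.default_rng(0)
models = rng.normal(size=(4, 2))
mixed = d2d_consensus(models, W, rounds=3)

# Column-stochasticity preserves the column sums of the model matrix,
# so the cluster-wide aggregate is unchanged by the D2D mixing steps.
assert np.allclose(mixed.sum(axis=0), models.sum(axis=0))

# The two largest singular values of W are the quantities the paper's
# bounds are stated in terms of; denser clusters shrink the second one.
sigmas = np.linalg.svd(W, compute_uv=False)
print(sigmas[0], sigmas[1])
```

The sum-preservation assertion is the reason column-stochastic weights are a natural fit here: local D2D mixing can run for any number of rounds without distorting the quantity that the server ultimately aggregates.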

Related research

- Two Timescale Hybrid Federated Learning with Cooperative D2D Local Model Aggregations (03/18/2021)
- Federated Learning Beyond the Star: Local D2D Model Consensus with Global Cluster Sampling (09/07/2021)
- Decentralized Aggregation for Energy-Efficient Federated Learning via Overlapped Clustering and D2D Communications (06/07/2022)
- Robust Federated Learning with Connectivity Failures: A Semi-Decentralized Framework with Collaborative Relaying (02/24/2022)
- Decentralized Model Dissemination Empowered Federated Learning in mmWave Aerial-Terrestrial Integrated Networks (02/28/2023)
- Federated Learning over Wireless Device-to-Device Networks: Algorithms and Convergence Analysis (01/29/2021)
- Dynamic Network-Assisted D2D-Aided Coded Distributed Learning (11/26/2021)
