
Federated Learning Beyond the Star: Local D2D Model Consensus with Global Cluster Sampling

09/07/2021
by   Frank Po-Chen Lin, et al.

Federated learning has emerged as a popular technique for distributing model training across the network edge. Its learning architecture is conventionally a star topology between the devices and a central server. In this paper, we propose two-timescale hybrid federated learning (TT-HF), which migrates to a more distributed topology via device-to-device (D2D) communications. In TT-HF, local model training occurs at devices via successive gradient iterations, and the synchronization process occurs at two timescales: (i) macro-scale, where global aggregations are carried out via device-server interactions, and (ii) micro-scale, where local aggregations are carried out via D2D cooperative consensus formation within different device clusters. Our theoretical analysis reveals how device-, cluster-, and network-level parameters affect the convergence of TT-HF, and leads to a set of conditions under which a convergence rate of O(1/t) is guaranteed. Experimental results demonstrate the improvements in convergence and utilization that can be obtained with TT-HF over state-of-the-art federated learning baselines.
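The two-timescale structure described in the abstract can be illustrated with a short, self-contained simulation. The sketch below is a toy under stated assumptions, not the authors' implementation: it uses synthetic quadratic objectives per device, uniform in-cluster averaging as a stand-in for D2D gossip, and one uniformly sampled device per cluster at each global aggregation. All names and parameters (run_tt_hf, LOCAL_STEPS, D2D_ROUNDS, GLOBAL_PERIOD, etc.) are illustrative assumptions.

```python
# Minimal sketch of a two-timescale hybrid FL loop (assumed structure, not the paper's code).
import numpy as np

rng = np.random.default_rng(0)

DIM = 5                  # model dimension
CLUSTERS = 3             # number of device clusters
DEVICES_PER_CLUSTER = 4
LOCAL_STEPS = 5          # micro-scale: local gradient iterations between D2D aggregations
D2D_ROUNDS = 2           # micro-scale: consensus iterations per local aggregation
GLOBAL_PERIOD = 4        # macro-scale: local aggregation periods between global aggregations
LR = 0.05

# Synthetic per-device objectives f_i(w) = 0.5 * ||w - target_i||^2,
# so the local gradient is simply (w - target_i).
targets = rng.normal(size=(CLUSTERS, DEVICES_PER_CLUSTER, DIM))

def d2d_consensus(models, rounds):
    """Mix models within a cluster; uniform averaging repeated `rounds`
    times stands in for gossip over the D2D graph."""
    W = np.full((len(models), len(models)), 1.0 / len(models))
    for _ in range(rounds):
        models = W @ models
    return models

def run_tt_hf(global_rounds=10):
    w_global = np.zeros(DIM)
    # every device starts from the global model
    models = np.tile(w_global, (CLUSTERS, DEVICES_PER_CLUSTER, 1))
    for t in range(global_rounds):
        for _ in range(GLOBAL_PERIOD):
            # micro-scale: local SGD followed by D2D consensus in each cluster
            for c in range(CLUSTERS):
                for _ in range(LOCAL_STEPS):
                    grads = models[c] - targets[c]
                    models[c] -= LR * grads
                models[c] = d2d_consensus(models[c], D2D_ROUNDS)
        # macro-scale: the server samples one device per cluster and averages
        sampled = [models[c][rng.integers(DEVICES_PER_CLUSTER)] for c in range(CLUSTERS)]
        w_global = np.mean(sampled, axis=0)
        models[:] = w_global  # broadcast the new global model to all devices
        loss = 0.5 * np.mean(np.sum((w_global - targets) ** 2, axis=-1))
        print(f"global round {t}: loss = {loss:.4f}")

if __name__ == "__main__":
    run_tt_hf()
```

In this toy setting, decreasing GLOBAL_PERIOD or increasing D2D_ROUNDS trades device-server communication for D2D communication, which mirrors the device/cluster/network-level trade-off that the paper's convergence analysis characterizes.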


Related research:

03/18/2021 · Two Timescale Hybrid Federated Learning with Cooperative D2D Local Model Aggregations
Federated learning has emerged as a popular technique for distributing m...

07/18/2020 · Multi-Stage Hybrid Federated Learning over Large-Scale Wireless Fog Networks
One of the popular methods for distributed machine learning (ML) is fede...

01/04/2021 · Device Sampling for Heterogeneous Federated Learning: Theory, Algorithms, and Implementation
The conventional federated learning (FedL) architecture distributes mach...

04/07/2022 · Decentralized Event-Triggered Federated Learning with Heterogeneous Communication Thresholds
A recent emphasis of distributed learning research has been on federated...

11/26/2021 · Dynamic Network-Assisted D2D-Aided Coded Distributed Learning
Today, various machine learning (ML) applications offer continuous data ...