Timely Communication in Federated Learning

12/31/2020
by Baturalp Buyukates, et al.

We consider a federated learning framework in which a parameter server (PS) trains a global model using n clients without centrally storing the client data at a cloud server. Focusing on a setting where the client datasets are rapidly changing and temporal in nature, we investigate the timeliness of model updates and propose a novel timely communication scheme. Under the proposed scheme, at each iteration the PS waits for m available clients, sends them the current model, and then uses the local updates of the earliest k out of these m clients to update the global model. We find the average age of information experienced by each client and characterize the age-optimal m and k values for a given n. Our results indicate that, in addition to ensuring timeliness, the proposed scheme yields significantly smaller average iteration times than random client selection without hurting the convergence of the global learning task.
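The (m, k) scheme described above can be sketched as a single aggregation round: wait for m available clients, dispatch the current model, and average only the earliest k local updates. The following is a minimal illustrative simulation, not the authors' implementation; the client structure, the `local_update` callback, and the exponential completion-time model are all assumptions made for the sketch.

```python
import random

def timely_fedavg_round(global_model, clients, m, k, rng):
    """One iteration of the (m, k) timely communication scheme (sketch):
    wait for m available clients, send them the model, then aggregate
    only the earliest k local updates."""
    # Hypothetical availability model: m clients become available this round.
    available = rng.sample(clients, m)
    # Each selected client computes a local update; its completion time
    # (compute + link delay) is modeled here as an exponential random draw.
    results = []
    for client in available:
        update = client["local_update"](global_model)
        finish_time = rng.expovariate(1.0)
        results.append((finish_time, update))
    # Keep only the earliest k of the m returned updates.
    results.sort(key=lambda pair: pair[0])
    earliest_k = [update for _, update in results[:k]]
    # FedAvg-style aggregation: coordinate-wise average of the k updates.
    return [sum(coords) / k for coords in zip(*earliest_k)]
```

Because the PS never waits on stragglers beyond the k-th arrival, the per-iteration time is governed by the k-th order statistic of the m completion times, which is the mechanism behind the smaller average iteration times claimed in the abstract.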


