Fast Federated Edge Learning with Overlapped Communication and Computation and Channel-Aware Fair Client Scheduling

09/14/2021
by Mehmet Emre Ozfatura, et al.

We consider federated edge learning (FEEL) over wireless fading channels, taking into account the downlink and uplink channel latencies and the random computation delays at the clients. We speed up the training process by overlapping communication with computation. With fountain-coded transmission of the global model update, clients receive the global model asynchronously and start performing local computations right away. We then propose a dynamic client scheduling policy, called MRTP, for uploading local model updates to the parameter server (PS), which, at any time, schedules the client with the minimum remaining upload time. However, MRTP can lead to biased participation of clients in the update process, resulting in performance degradation in non-IID data scenarios. To overcome this, we propose two alternative schemes with fairness considerations, termed age-aware MRTP (A-MRTP) and opportunistically fair MRTP (OF-MRTP). In A-MRTP, the remaining clients are scheduled according to the ratio between their remaining transmission time and their update age, while in OF-MRTP, the selection mechanism utilizes the long-term average channel rate of the clients to further reduce the latency while ensuring fair participation of the clients. It is shown through numerical simulations that OF-MRTP provides a significant reduction in latency without sacrificing test accuracy.
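The two scheduling rules stated in the abstract can be sketched compactly. The snippet below is a minimal illustration, not the paper's implementation: the `Client` fields and function names are assumptions, and OF-MRTP is omitted because the abstract does not specify its exact selection rule beyond its use of long-term average channel rates.

```python
# Hypothetical client state for illustration; field names are assumptions,
# not taken from the paper.
class Client:
    def __init__(self, cid, remaining_time, age, avg_rate):
        self.cid = cid
        self.remaining_time = remaining_time  # estimated remaining upload time (s)
        self.age = age                        # rounds since this client's last scheduled update
        self.avg_rate = avg_rate              # long-term average channel rate (used by OF-MRTP)

def mrtp(clients):
    """MRTP: schedule the client with the minimum remaining upload time."""
    return min(clients, key=lambda c: c.remaining_time)

def a_mrtp(clients):
    """A-MRTP: schedule by the ratio of remaining transmission time to update
    age, so stale (long-unscheduled) clients are favored for fairness."""
    return min(clients, key=lambda c: c.remaining_time / max(c.age, 1))
```

For example, a client with a long remaining upload time but a very stale update can still win under A-MRTP, which is exactly the bias correction the abstract motivates for non-IID data.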

