Client Selection Approach in Support of Clustered Federated Learning over Wireless Edge Networks

08/16/2021
by Abdullatif Albaseer, et al.

Clustered Federated Multitask Learning (CFL) was introduced as an efficient scheme to obtain reliable specialized models when data is imbalanced and distributed in a non-i.i.d. (non-independent and identically distributed) fashion among clients. While a similarity metric, such as the cosine similarity, can be used to endow groups of clients with specialized models, this process can be arduous because the server must involve all clients in every federated learning round. Given the limited bandwidth and latency constraints at the network edge, it is therefore imperative to select only a subset of clients in each round. To this end, this paper proposes a new client selection algorithm that aims to accelerate convergence toward specialized machine learning models that achieve high test accuracy for all client groups. Specifically, we introduce a client selection approach that leverages device heterogeneity to schedule clients based on their round latency and exploits bandwidth reuse for clients that need more time to update the model. The server then performs model averaging and clusters the clients based on predefined thresholds. When a specific cluster reaches a stationary point, the proposed algorithm applies greedy scheduling to that group, selecting the clients with the lowest latency to update the model. Extensive experiments show that the proposed approach lowers the training time and accelerates the convergence rate by up to 50% while providing each client with a specialized model that fits its local data distribution.
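As a rough illustration of the two ingredients described in the abstract, latency-aware greedy client scheduling and cosine-similarity-based cluster splitting, a minimal Python sketch is shown below. It is not the authors' implementation: the latency model, the threshold values eps_mean and eps_max (standing in for the paper's "predefined thresholds"), and all function names are illustrative assumptions.

```python
# Minimal illustrative sketch (not the authors' code) of latency-aware greedy
# client selection and cosine-similarity-based cluster splitting in the style
# of clustered federated learning. All names and thresholds are assumptions.
import numpy as np


def estimated_round_latency(comp_time_s, model_bits, uplink_rate_bps):
    """Rough per-round latency of a client: local computation + uplink time."""
    return comp_time_s + model_bits / uplink_rate_bps


def greedy_select(latencies, k):
    """Greedy schedule: pick the k clients with the smallest estimated latency."""
    return list(np.argsort(latencies)[:k])


def cosine_similarity_matrix(updates):
    """Pairwise cosine similarity of flattened client weight updates."""
    unit = updates / np.clip(np.linalg.norm(updates, axis=1, keepdims=True), 1e-12, None)
    return unit @ unit.T


def maybe_split(updates, eps_mean=0.4, eps_max=1.6):
    """Bipartition a client group once it is near a stationary point.

    Splitting is triggered when the averaged update is small (the shared model
    is near stationary) while at least one individual update is still large
    (clients pull in incongruent directions); eps_mean and eps_max stand in
    for the predefined thresholds mentioned in the abstract.
    """
    mean_update_norm = np.linalg.norm(updates.mean(axis=0))
    max_update_norm = np.linalg.norm(updates, axis=1).max()
    if mean_update_norm >= eps_mean or max_update_norm <= eps_max:
        return None  # keep training a single shared model for this group
    sim = cosine_similarity_matrix(updates)
    # Seed the two sub-clusters with the least similar pair of clients,
    # then assign every other client to the seed it is most similar to.
    i, j = np.unravel_index(np.argmin(sim), sim.shape)
    g1, g2 = [int(i)], [int(j)]
    for c in range(updates.shape[0]):
        if c in (i, j):
            continue
        (g1 if sim[c, i] >= sim[c, j] else g2).append(c)
    return g1, g2


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two latent groups with opposite update directions plus small noise,
    # emulating a group whose shared model has reached a stationary point.
    base = rng.normal(size=(1, 100))
    updates = np.vstack([+2.0 * base + 0.05 * rng.normal(size=(4, 100)),
                         -2.0 * base + 0.05 * rng.normal(size=(4, 100))])
    latencies = np.array([estimated_round_latency(rng.uniform(1, 5),      # compute time (s)
                                                  1e6,                    # ~1 Mbit update
                                                  rng.uniform(1e5, 1e6))  # uplink rate (bit/s)
                          for _ in range(8)])
    print("scheduled clients:", greedy_select(latencies, k=4))
    print("cluster split:", maybe_split(updates))
```

In a complete system, the latency estimates would be refreshed from measured computation and uplink times each round, and, per the abstract, the greedy schedule would only be applied to a cluster after it reaches a stationary point.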

Related research

04/26/2023: Fair Selection of Edge Nodes to Participate in Clustered Federated Multitask Learning
Clustered federated multitask learning is introduced as an efficient tec...

02/13/2023: FilFL: Accelerating Federated Learning via Client Filtering
Federated learning is an emerging machine learning paradigm that enables...

09/14/2021: Fast Federated Edge Learning with Overlapped Communication and Computation and Channel-Aware Fair Client Scheduling
We consider federated edge learning (FEEL) over wireless fading channels...

05/22/2023: When Computing Power Network Meets Distributed Machine Learning: An Efficient Federated Split Learning Framework
In this paper, we advocate CPN-FedSL, a novel and flexible Federated Spl...

11/04/2022: Heterogeneity-aware Clustered Distributed Learning for Multi-source Data Analysis
In diverse fields ranging from finance to omics, it is increasingly comm...

07/01/2020: Revisiting Comparative Performance of DNS Resolvers in the IPv6 and ECS Era
This paper revisits the issue of the performance of DNS resolution servi...

12/11/2021: FedSoft: Soft Clustered Federated Learning with Proximal Local Updating
Traditionally, clustered federated learning groups clients with the same...
