Federated Two Stage Decoupling With Adaptive Personalization Layers

08/30/2023
by Hangyu Zhu, et al.

Federated learning has gained significant attention due to its ability to enable distributed learning while maintaining privacy constraints. However, because of data heterogeneity among decentralized devices, it inherently suffers from significant learning degradation and slow convergence. It is therefore natural to cluster homogeneous clients into the same group, so that only the model weights within each group are aggregated. Most existing clustered federated learning methods employ either model gradients or inference outputs as the metric for client partitioning, with the goal of grouping similar devices together, yet heterogeneity may still remain within each cluster. Moreover, little research has explored the underlying reasons for choosing the appropriate timing of clustering, which commonly results in each client being assigned to its own individual cluster, particularly in the context of highly non-independent and identically distributed (non-IID) data. In this paper, we introduce a two-stage decoupling federated learning algorithm with adaptive personalization layers, named FedTSDP, in which client clustering is performed twice, according to inference outputs and model weights, respectively. Hopkins amended sampling is adopted to determine the appropriate timing for clustering and the sampling weight of public unlabeled data. In addition, a simple yet effective approach is developed to adaptively adjust the personalization layers based on varying degrees of data skew. Experimental results show that the proposed method achieves reliable performance in both IID and non-IID scenarios.
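The abstract names Hopkins amended sampling as the trigger for clustering but does not spell out the statistic itself. As background, the plain Hopkins statistic measures whether a set of points (here, one row per client, e.g. its inference outputs on the public unlabeled data) shows cluster tendency: values near 0.5 suggest roughly uniform points, values clearly above 0.5 suggest cluster structure. The NumPy sketch below implements only this standard statistic; the "amended" sampling variant, the sample_ratio, and any decision threshold are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def hopkins_statistic(X, sample_ratio=0.1, seed=None):
    """Plain Hopkins statistic H in [0, 1].

    H close to 0.5 suggests no cluster tendency; H clearly above 0.5
    suggests cluster structure. This is a sketch of the standard
    statistic, not the "amended" sampling variant used by FedTSDP,
    whose details are not given in the abstract.
    """
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    n, d = X.shape
    m = max(1, int(sample_ratio * n))

    # m probe points drawn uniformly from the bounding box of X
    lo, hi = X.min(axis=0), X.max(axis=0)
    probes = rng.uniform(lo, hi, size=(m, d))

    # m real points drawn without replacement from X
    samples = X[rng.choice(n, size=m, replace=False)]

    def nearest_dist(points, data, exclude_self=False):
        dists = np.linalg.norm(points[:, None, :] - data[None, :, :], axis=-1)
        if exclude_self:
            dists[dists == 0.0] = np.inf  # crude self-exclusion (assumes no duplicates)
        return dists.min(axis=1)

    u = nearest_dist(probes, X)                      # probe -> nearest data point
    w = nearest_dist(samples, X, exclude_self=True)  # data point -> nearest neighbour
    return float(u.sum() / (u.sum() + w.sum()))
```

A server could, for example, stack each client's averaged logits on the shared unlabeled set into a matrix X and defer clustering until hopkins_statistic(X) exceeds a chosen threshold; the exact criterion used by FedTSDP is not given in the abstract.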
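The two-stage decoupling itself can likewise be sketched: a first partition of clients by their inference outputs on the shared unlabeled data, followed by a refinement of each group by flattened model weights, with aggregation then restricted to each final group. In the sketch below, KMeans as the clustering algorithm, the cluster counts k1 and k2, and the Hopkins threshold of 0.6 are all illustrative assumptions; it also reuses the hopkins_statistic function from the previous sketch.

```python
import numpy as np
from sklearn.cluster import KMeans

def two_stage_grouping(logits, weights, k1=3, k2=2, hopkins_threshold=0.6):
    """Hypothetical two-stage client grouping in the spirit of FedTSDP.

    logits  : (n_clients, n_outputs) inference outputs on public unlabeled data
    weights : (n_clients, n_params)  flattened model weights
    Returns a list of client-index groups whose models may be aggregated together.
    """
    n = len(logits)
    # Only cluster when the client representations actually show cluster tendency
    # (hopkins_statistic is the sketch from the previous block).
    if hopkins_statistic(logits) < hopkins_threshold:
        return [list(range(n))]  # treat all clients as one homogeneous group

    # Stage 1: partition clients by inference outputs.
    stage1 = KMeans(n_clusters=k1, n_init=10).fit_predict(logits)

    groups = []
    for c in range(k1):
        members = np.where(stage1 == c)[0]
        if len(members) <= k2:  # too few clients to split further
            groups.append(members.tolist())
            continue
        # Stage 2: refine each group by model weights.
        stage2 = KMeans(n_clusters=k2, n_init=10).fit_predict(weights[members])
        for s in range(k2):
            sub = members[stage2 == s]
            if len(sub) > 0:
                groups.append(sub.tolist())
    return groups
```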
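Finally, the adaptive personalization layers can be illustrated with a hypothetical rule: quantify each client's label skew and keep more of the trailing layers local as the skew grows, excluding those layers from server-side aggregation. The skew score (normalized total-variation distance from a uniform label distribution) and the linear mapping to a layer count below are assumptions for illustration, not the rule proposed in the paper.

```python
import numpy as np

def select_personalization_depth(label_counts, max_layers):
    """Hypothetical rule: stronger label skew -> more trailing layers kept local.

    label_counts : per-class sample counts on one client
    max_layers   : largest number of layers allowed to stay personalized
    Returns the number of trailing layers to exclude from aggregation.
    """
    p = np.asarray(label_counts, dtype=float)
    p = p / p.sum()
    c = p.size
    uniform = np.full(c, 1.0 / c)
    # Total-variation distance from the uniform distribution, normalized to [0, 1].
    skew = 0.5 * np.abs(p - uniform).sum() / (1.0 - 1.0 / c)
    return int(round(skew * max_layers))
```

Under this rule a client with a balanced label distribution would share its whole model, while a heavily skewed client would keep up to max_layers trailing layers (e.g. its classifier head) local.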


