FedCliP: Federated Learning with Client Pruning

01/17/2023
by   Beibei Li, et al.

Prevalent communication-efficient federated learning (FL) frameworks usually take advantage of model gradient compression or model distillation. However, the unbalanced local data distributions (in either quantity or quality) of participating clients, which contribute non-equivalently to global model training, still pose a major challenge to these works. In this paper, we propose FedCliP, a novel communication-efficient FL framework that enables faster model training by adaptively learning which clients should remain active for further model training and pruning those that should become inactive due to their lower potential contributions. We also introduce an alternative optimization method with a newly defined contribution score measure to facilitate the determination of active and inactive clients. We empirically evaluate the communication efficiency of FL frameworks with extensive experiments on three benchmark datasets under both IID and non-IID settings. Numerical results demonstrate that the proposed FedCliP framework outperforms state-of-the-art FL frameworks, e.g., FedCliP can save up to 70% of communication overheads with negligible accuracy loss on the MNIST dataset, and save 50% of communication overheads with less than 1% accuracy loss on the remaining benchmark datasets.
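The abstract describes pruning clients by a contribution score so that only the most useful clients stay active in later rounds. The paper's own score definition is not given here, so the sketch below is purely illustrative: it uses a hypothetical proxy score (update magnitude weighted by data share) and keeps the top-scoring fraction of clients active, which is the general shape of such a scheme rather than FedCliP's actual algorithm.

```python
def contribution_score(update_norm, data_size, total_size):
    # Hypothetical proxy score: weight a client's update magnitude by its
    # share of the total training data. FedCliP defines its own contribution
    # score measure; this stands in only to illustrate the pruning loop.
    return update_norm * (data_size / total_size)

def prune_clients(clients, keep_ratio=0.7):
    """Keep the highest-scoring fraction of clients active; prune the rest.

    Each client is a dict with 'update_norm' (magnitude of its last model
    update) and 'data_size' (number of local samples).
    """
    total = sum(c["data_size"] for c in clients)
    scored = sorted(
        clients,
        key=lambda c: contribution_score(c["update_norm"], c["data_size"], total),
        reverse=True,
    )
    # Always keep at least one client so training can continue.
    k = max(1, int(len(scored) * keep_ratio))
    return scored[:k], scored[k:]  # (active, inactive)
```

In a full FL loop, the server would recompute these scores each round and re-evaluate which clients to reactivate or prune, rather than pruning once and for all.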


