Communication-Efficient and Drift-Robust Federated Learning via Elastic Net

10/06/2022
by Seonhyeong Kim et al.

Federated learning (FL) is a distributed method for training a global model over a set of local clients while keeping data localized. It reduces privacy and security risks but faces important challenges, including expensive communication costs and client drift. To address these issues, we propose FedElasticNet, a communication-efficient and drift-robust FL framework that leverages the elastic net. It repurposes the two elastic net regularizers (i.e., ℓ_1 and ℓ_2 penalties on the local model updates): (1) the ℓ_1-norm regularizer sparsifies the local updates to reduce communication costs, and (2) the ℓ_2-norm regularizer resolves the client drift problem by limiting the impact of local updates that drift due to data heterogeneity. FedElasticNet is a general framework for FL; hence, it can be integrated into prior FL techniques such as FedAvg, FedProx, SCAFFOLD, and FedDyn without additional costs. We show that our framework effectively resolves the communication cost and client drift problems simultaneously.
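To make the roles of the two penalties concrete, here is a minimal NumPy sketch of a single client's local step on a least-squares loss, with the elastic net applied to the update (w - w_global). The proximal (soft-thresholding) handling of the ℓ_1 term, the function names, and all hyperparameter values are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of the l1 norm: shrinks x toward zero."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def local_update(w_global, X, y, lam1=0.05, lam2=0.1, lr=0.1, steps=50):
    """One client's local training with an elastic net penalty on the
    update (w - w_global). Illustrative sketch only:
    - the l2 term pulls w back toward w_global (limits client drift)
    - the l1 term is applied via a proximal step, sparsifying the update
    Hyperparameter values here are assumptions, not from the paper.
    """
    w = w_global.copy()
    n = len(y)
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / n          # gradient of the data loss
        grad += lam2 * (w - w_global)         # gradient of the l2 drift penalty
        w -= lr * grad
        # soft-threshold the update for the l1 penalty; zeroed coordinates
        # need not be communicated to the server
        w = w_global + soft_threshold(w - w_global, lr * lam1)
    return w - w_global                       # sparse update sent to the server

# Usage on synthetic data: count how many coordinates survive sparsification.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(32, 10)), rng.normal(size=32)
delta = local_update(np.zeros(10), X, y)
print(np.count_nonzero(delta), "of", delta.size, "coordinates transmitted")
```

Because the penalties act on the update rather than the weights themselves, this kind of step can wrap the local solver of FedAvg-style methods without changing the server-side aggregation.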


