Communication-Efficient Federated Learning with Dual-Side Low-Rank Compression

04/26/2021
by Zhefeng Qiao, et al.

Federated learning (FL) is a promising and powerful approach for training deep learning models without sharing clients' raw data. During FL training, the central server and the distributed clients must periodically exchange a large amount of model information. To address this communication-intensive training, we propose a new training method, referred to as federated learning with dual-side low-rank compression (FedDLR), where the deep learning model is compressed via low-rank approximations at both the server and client sides. The proposed FedDLR not only reduces the communication overhead during the training stage but also directly produces a compact model that speeds up inference. We provide a convergence analysis, investigate the influence of the key parameters, and empirically show that FedDLR outperforms state-of-the-art solutions in both communication and computation efficiency.
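To make the core idea concrete, below is a minimal NumPy sketch of the standard way to realize such a low-rank approximation: truncating the SVD of a weight matrix and transmitting the two factors instead of the full matrix. The function names, the fixed rank, and the example matrix size are illustrative assumptions, not the paper's actual implementation; per the abstract, FedDLR applies this kind of compression on both the server (download) and client (upload) sides and analyzes how the rank parameter affects convergence.

```python
import numpy as np

def lowrank_compress(W: np.ndarray, rank: int):
    """Compress a weight matrix W (m x n) into two low-rank factors.

    Instead of sending the m*n entries of W, the sender transmits
    U_r (m x r) and V_r (r x n), i.e. (m + n) * r values, which is a
    saving whenever r < m * n / (m + n).
    """
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    U_r = U[:, :rank] * S[:rank]   # absorb singular values into the left factor
    V_r = Vt[:rank, :]
    return U_r, V_r

def lowrank_decompress(U_r: np.ndarray, V_r: np.ndarray) -> np.ndarray:
    """Reconstruct the rank-r approximation of W on the receiving side."""
    return U_r @ V_r

# Hypothetical exchange in one round (the "dual-side" idea): the server
# compresses the global model before download, and each client compresses
# its local model before upload.
W = np.random.randn(512, 256)          # a hypothetical layer's weights
U_r, V_r = lowrank_compress(W, rank=32)
W_hat = lowrank_decompress(U_r, V_r)

ratio = (U_r.size + V_r.size) / W.size
err = np.linalg.norm(W - W_hat) / np.linalg.norm(W)
print(f"compression ratio: {ratio:.2f}, relative error: {err:.3f}")
```

Because both directions of communication carry factored matrices, the model effectively stays in low-rank form throughout training, which is why the abstract notes that FedDLR also yields a compact model for inference rather than requiring a separate post-training compression step.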


Related research

11/17/2022 · Improving Federated Learning Communication Efficiency with Global Momentum Fusion for Gradient Compression Schemes
Communication costs within Federated learning hinder the system scalabil...

06/04/2023 · Riemannian Low-Rank Model Compression for Federated Learning with Over-the-Air Aggregation
Low-rank model compression is a widely used technique for reducing the c...

08/13/2021 · FedPara: Low-rank Hadamard Product Parameterization for Efficient Federated Learning
To overcome the burdens on frequent model uploads and downloads during f...

02/01/2022 · Recycling Model Updates in Federated Learning: Are Gradient Subspaces Low-Rank?
In this paper, we question the rationale behind propagating large number...

03/25/2023 · Edge Selection and Clustering for Federated Learning in Optical Inter-LEO Satellite Constellation
Low-Earth orbit (LEO) satellites have been prosperously deployed for var...

06/20/2022 · QuAFL: Federated Averaging Can Be Both Asynchronous and Communication-Efficient
Federated Learning (FL) is an emerging paradigm to enable the large-scal...

01/06/2022 · Federated Optimization of Smooth Loss Functions
In this work, we study empirical risk minimization (ERM) within a federa...
