FedVQCS: Federated Learning via Vector Quantized Compressed Sensing

04/16/2022
by Yongjeong Oh, et al.

In this paper, a new communication-efficient federated learning (FL) framework is proposed, inspired by vector quantized compressed sensing. The basic strategy of the proposed framework is to compress the local model update at each device by applying dimensionality reduction followed by vector quantization. The global model update is then reconstructed at a parameter server (PS) by applying a sparse signal recovery algorithm to the aggregation of the compressed local model updates. By harnessing the benefits of both dimensionality reduction and vector quantization, the proposed framework effectively reduces the communication overhead of local update transmissions. Both the design of the vector quantizer and the key compression parameters are optimized to minimize the reconstruction error of the global model update under the constraint of the wireless link capacity. Accounting for this reconstruction error, the convergence rate of the proposed framework is also analyzed for a smooth loss function. Simulation results on the MNIST and CIFAR-10 datasets demonstrate that the proposed framework provides more than a 2.5% improvement in classification accuracy over state-of-the-art FL frameworks when the communication overhead of the local model update transmission is less than 0.1 bit per local model entry.
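To make the compress-then-reconstruct pipeline concrete, below is a minimal NumPy sketch of one communication round. It is an illustration under stated assumptions rather than the paper's design: a random Gaussian matrix stands in for the dimensionality-reduction step, a small k-means-trained codebook stands in for the optimized vector quantizer, iterative hard thresholding (IHT) stands in for the sparse recovery algorithm, and the local updates are assumed to share a common sparse support so that their average remains sparse. All dimensions and function names are hypothetical.

```python
# Minimal sketch of the FedVQCS-style pipeline described above (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
d, m, k = 512, 256, 32          # model size, compressed size, assumed sparsity
n_dev, block, n_code = 4, 4, 16 # devices, VQ block length, codebook size

# Random projection shared by the devices and the parameter server (PS).
A = rng.standard_normal((m, d)) / np.sqrt(m)

def sparsify(x, k):
    """Keep only the k largest-magnitude entries (top-k sparsification)."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

def train_codebook(samples, n_code, iters=25):
    """Tiny Lloyd/k-means codebook, standing in for the paper's optimized quantizer."""
    cb = samples[rng.choice(len(samples), n_code, replace=False)].copy()
    for _ in range(iters):
        assign = ((samples[:, None, :] - cb[None, :, :]) ** 2).sum(-1).argmin(1)
        for c in range(n_code):
            if np.any(assign == c):
                cb[c] = samples[assign == c].mean(axis=0)
    return cb

codebook = train_codebook(rng.standard_normal((4000, block)), n_code)

def vq_encode(y, codebook):
    """Normalize, split into blocks, and report nearest-codeword indices plus a gain."""
    scale = np.linalg.norm(y) / np.sqrt(y.size) + 1e-12
    blocks = (y / scale).reshape(-1, block)
    idx = ((blocks[:, None, :] - codebook[None, :, :]) ** 2).sum(-1).argmin(1)
    return idx, scale

def vq_decode(idx, scale, codebook):
    return scale * codebook[idx].reshape(-1)

def iht(z, A, k, iters=100):
    """Iterative hard thresholding: recover a k-sparse x from z ~= A @ x."""
    step = 0.9 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = sparsify(x + step * A.T @ (z - A @ x), k)
    return x

# Devices: local updates assumed to share a common sparse support, so the
# aggregated global update is also k-sparse (a simplifying assumption).
support = rng.choice(d, size=k, replace=False)
updates = []
for _ in range(n_dev):
    g = np.zeros(d)
    g[support] = rng.standard_normal(k)
    updates.append(g)

# Device side: dimensionality reduction followed by vector quantization.
payloads = [vq_encode(A @ g, codebook) for g in updates]

# PS side: aggregate the dequantized measurements, then run sparse recovery.
z = np.mean([vq_decode(idx, s, codebook) for idx, s in payloads], axis=0)
g_hat = iht(z, A, k)

g_true = np.mean(updates, axis=0)
print("relative reconstruction error:",
      np.linalg.norm(g_hat - g_true) / np.linalg.norm(g_true))
```

In this toy setting, each device sends 64 codeword indices of 4 bits plus a single scalar gain, roughly 0.5 bit per entry of the 512-entry model, which illustrates how combining dimensionality reduction with vector quantization pushes the per-entry communication cost below one bit.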



Related research

07/20/2023 · Communication-Efficient Federated Learning over Capacity-Limited Wireless Networks
11/30/2021 · Communication-Efficient Federated Learning via Quantized Compressed Sensing
07/29/2021 · QuPeD: Quantized Personalization via Distillation with Applications to Federated Learning
01/23/2020 · Communication Efficient Federated Learning over Multiple Access Channels
06/01/2021 · Wireless Federated Learning with Limited Communication and Differential Privacy
07/20/2023 · Communication-Efficient Split Learning via Adaptive Feature-Wise Compression
03/15/2023 · Communication-Efficient Design for Quantized Decentralized Federated Learning
