Communication-Efficient Federated Learning via Quantized Compressed Sensing

11/30/2021
by   Yongjeong Oh, et al.

In this paper, we present a communication-efficient federated learning framework inspired by quantized compressed sensing. The presented framework consists of gradient compression at the wireless devices and gradient reconstruction at the parameter server (PS). Our strategy for gradient compression is to sequentially perform block sparsification, dimension reduction, and quantization. Thanks to gradient sparsification and quantization, our strategy achieves a higher compression ratio than one-bit gradient compression. For accurate aggregation of the local gradients from the compressed signals at the PS, we put forth an approximate minimum mean square error (MMSE) approach to gradient reconstruction using the expectation-maximization generalized-approximate-message-passing (EM-GAMP) algorithm. Assuming a Bernoulli Gaussian-mixture prior, this algorithm iteratively updates the posterior mean and variance of the local gradients from the compressed signals. We also present a low-complexity approach to gradient reconstruction, in which we use the Bussgang theorem to aggregate the local gradients from the compressed signals and then compute an approximate MMSE estimate of the aggregated gradient using the EM-GAMP algorithm. We also provide a convergence rate analysis of the presented framework. Using the MNIST dataset, we demonstrate that the presented framework achieves performance almost identical to that without compression, while significantly reducing the communication overhead of federated learning.
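To make the device-side compression pipeline concrete, here is a minimal sketch of the three steps named above (block sparsification, dimension reduction, quantization). The block size, sparsity level, compression ratio, and one-bit sign quantizer are illustrative assumptions, not the parameters used in the paper.

```python
import numpy as np

def compress_gradient(grad, block_size=1024, sparsity=0.1,
                      compression_ratio=0.25, seed=0):
    """Illustrative sketch: block sparsification, random-projection
    dimension reduction, and one-bit quantization of the compressed
    measurements. All parameter values here are assumptions."""
    rng = np.random.default_rng(seed)
    compressed = []
    for start in range(0, grad.size, block_size):
        block = grad[start:start + block_size]

        # 1) Block sparsification: keep only the k largest-magnitude entries.
        k = max(1, int(sparsity * block.size))
        keep = np.argpartition(np.abs(block), -k)[-k:]
        sparse = np.zeros_like(block)
        sparse[keep] = block[keep]

        # 2) Dimension reduction via a random Gaussian sensing matrix.
        m = max(1, int(compression_ratio * block.size))
        A = rng.standard_normal((m, block.size)) / np.sqrt(m)
        y = A @ sparse

        # 3) Scalar quantization of the measurements (one-bit here;
        #    a multi-bit quantizer would give finer resolution).
        compressed.append(np.sign(y))
    return compressed
```

On the server side, the paper reconstructs the aggregated gradient from these quantized measurements with the EM-GAMP algorithm under a Bernoulli Gaussian-mixture prior (or with the low-complexity Bussgang-based variant); that reconstruction step is omitted from this sketch.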


