Deep Hierarchy Quantization Compression Algorithm Based on Dynamic Sampling

12/30/2022
by Wan Jiang, et al.

Unlike traditional distributed machine learning, federated learning keeps data on local devices for training and aggregates only the models on a central server, which avoids the data security problems that can arise in traditional distributed machine learning. However, during training, transmitting model parameters can place a significant load on network bandwidth. Prior work has shown that the vast majority of transmitted model parameters are redundant. Building on this observation, we study the distribution of a selected subset of model parameters and propose a deep hierarchical quantization compression algorithm, which further compresses the model through layer-wise quantization of its parameters and thereby reduces the network load caused by parameter transmission. In addition, we adopt a dynamic sampling strategy for client selection to accelerate model convergence. Experimental results on several public datasets demonstrate the effectiveness of our algorithm.
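A minimal sketch of the two ideas described in the abstract, not the authors' implementation: hierarchical quantization is approximated here by per-layer uniform quantization whose bit width shrinks for layers with low-variance (largely redundant) parameters, and dynamic sampling is approximated by choosing clients with probability proportional to their latest local loss. All function names, bit widths, and thresholds below are illustrative assumptions, since the abstract does not specify the exact hierarchy or sampling rule.

```python
import numpy as np

def quantize_layer(weights: np.ndarray, bits: int):
    """Uniformly quantize one layer's weights to the given bit width."""
    lo, hi = float(weights.min()), float(weights.max())
    levels = 2 ** bits - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    q = np.round((weights - lo) / scale).astype(np.uint8)  # bits <= 8 assumed
    return q, lo, scale

def dequantize_layer(q: np.ndarray, lo: float, scale: float) -> np.ndarray:
    """Recover an approximation of the original weights on the server."""
    return q.astype(np.float32) * scale + lo

def hierarchical_quantize(model: dict) -> dict:
    """Assign fewer bits to layers whose parameters have low variance
    (assumed redundant), more bits to high-variance layers.
    The variance thresholds are arbitrary placeholders."""
    compressed = {}
    for name, w in model.items():
        std = float(w.std())
        bits = 8 if std > 1e-1 else 4 if std > 1e-2 else 2
        q, lo, scale = quantize_layer(w, bits)
        compressed[name] = (q, lo, scale, bits)
    return compressed

def sample_clients(losses: np.ndarray, k: int, rng: np.random.Generator) -> np.ndarray:
    """Dynamic sampling sketch: pick k clients with probability proportional
    to their most recent local loss, so slow-converging clients are
    revisited more often."""
    p = losses / losses.sum()
    return rng.choice(len(losses), size=k, replace=False, p=p)

# Illustrative usage with random weights standing in for a local model.
rng = np.random.default_rng(0)
model = {"conv1": rng.normal(0.0, 0.2, (3, 3, 16)),
         "fc": rng.normal(0.0, 0.01, (128, 10))}
compressed = hierarchical_quantize(model)
picked = sample_clients(np.array([0.9, 0.3, 1.2, 0.5]), k=2, rng=rng)
```

Under this sketch, only the quantized integer tensors plus a per-layer offset and scale are transmitted, so layers judged redundant cost as little as 2 bits per parameter instead of 32.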


Related research

10/31/2021
DAdaQuant: Doubly-adaptive quantization for communication-efficient Federated Learning
Federated Learning (FL) is a powerful technique for training a model on ...

05/06/2022
Online Model Compression for Federated Learning with Large Models
This paper addresses the challenges of training large neural network mod...

12/11/2022
ResFed: Communication Efficient Federated Learning by Transmitting Deep Compressed Residuals
Federated learning enables cooperative training among massively distribu...

03/21/2020
Dynamic Sampling and Selective Masking for Communication-Efficient Federated Learning
Federated learning (FL) is a novel machine learning setting which enable...

06/17/2021
Quantized Federated Learning under Transmission Delay and Outage Constraints
Federated learning (FL) has been recognized as a viable distributed lear...

02/01/2023
: Downlink Compression for Cross-Device Federated Learning
Many compression techniques have been proposed to reduce the communicati...

11/02/2021
FedGraph: Federated Graph Learning with Intelligent Sampling
Federated learning has attracted much research attention due to its priv...
