A Fast Blockchain-based Federated Learning Framework with Compressed Communications

08/12/2022
by Laizhong Cui, et al.

Recently, blockchain-based federated learning (BFL) has attracted intensive research attention because its training process is auditable and its serverless architecture avoids the single point of failure of the parameter server in vanilla federated learning (VFL). Nevertheless, BFL tremendously escalates the communication traffic volume, because every local model update (i.e., the change in model parameters) produced by a BFL client is transmitted to all miners for verification and to all clients for aggregation, whereas the parameter server and clients in VFL only exchange aggregated model updates. This huge communication traffic inevitably impairs training efficiency and hinders the real-world deployment of BFL. To improve the practicality of BFL, we are among the first to propose a fast blockchain-based, communication-efficient federated learning framework that compresses communications in BFL, called BCFL. We also derive the convergence rate of BCFL under non-convex loss. To maximize the final model accuracy, we further formulate the problem of minimizing the training loss given by the convergence rate, subject to a limited training time, with respect to the compression rate and the block generation rate; this is a bi-convex optimization problem that can be solved efficiently. Finally, to demonstrate the efficiency of BCFL, we carry out extensive experiments on the standard CIFAR-10 and FEMNIST datasets. Our experimental results not only verify the correctness of our analysis, but also show that BCFL can remarkably reduce the communication traffic by 95-98% and shorten the training time by 90-95%.
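The abstract names two technical ingredients: a compressor applied to the local model updates that clients broadcast, and a bi-convex optimization over the compression rate and the block generation rate. Neither is specified in detail here, so the following is a minimal sketch rather than the paper's method: it uses top-k sparsification (one common update compressor) as a stand-in, and all function names are hypothetical.

```python
import numpy as np

def compress_update(update: np.ndarray, compression_rate: float):
    """Top-k sparsification: transmit only the largest-magnitude entries.

    `compression_rate` is the fraction of entries kept, so a rate of 0.02
    sends roughly 2% of the update (values plus their indices).
    """
    k = max(1, int(compression_rate * update.size))
    idx = np.argpartition(np.abs(update), -k)[-k:]  # k largest by magnitude
    return idx.astype(np.int64), update[idx]

def decompress_update(idx, values, size):
    """Rebuild a dense update on the receiver; unsent entries become zero."""
    dense = np.zeros(size, dtype=values.dtype)
    dense[idx] = values
    return dense

# A client compresses its local update before broadcasting it to miners
# (for verification) and to the other clients (for aggregation).
rng = np.random.default_rng(0)
local_update = rng.normal(size=1_000_000).astype(np.float32)
idx, vals = compress_update(local_update, compression_rate=0.02)
restored = decompress_update(idx, vals, local_update.size)
```

A bi-convex problem is convex in each variable when the other is held fixed, so it can be solved by alternating minimization: optimize the convex subproblem in the compression rate, then in the block generation rate, and repeat. The objective below is a toy stand-in, not the loss/time trade-off derived in the paper.

```python
from scipy.optimize import minimize_scalar

def alternating_minimization(f, x0, y0, x_bounds, y_bounds, iters=20):
    """Alternately minimize a bi-convex f(x, y) over each variable."""
    x, y = x0, y0
    for _ in range(iters):
        x = minimize_scalar(lambda x: f(x, y), bounds=x_bounds, method="bounded").x
        y = minimize_scalar(lambda y: f(x, y), bounds=y_bounds, method="bounded").x
    return x, y

# Toy bi-convex objective (hypothetical): convex in x for fixed y and in y
# for fixed x, but not jointly convex.
f = lambda x, y: (x * y - 1.0) ** 2 + 0.1 * x ** 2 + 0.1 * y ** 2
print(alternating_minimization(f, 1.0, 1.0, (0.01, 10.0), (0.01, 10.0)))
```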


Related research

07/19/2021
RingFed: Reducing Communication Costs in Federated Learning on Non-IID Data
Federated learning is a widely used distributed deep learning framework ...

05/10/2021
Slashing Communication Traffic in Federated Learning by Transmitting Clustered Model Updates
Federated Learning (FL) is an emerging decentralized learning framework ...

12/11/2022
ResFed: Communication Efficient Federated Learning by Transmitting Deep Compressed Residuals
Federated learning enables cooperative training among massively distribu...

12/13/2021
Optimal Rate Adaption in Federated Learning with Compressed Communications
Federated Learning (FL) incurs high communication overhead, which can be...

02/01/2023
DoCoFL: Downlink Compression for Cross-Device Federated Learning
Many compression techniques have been proposed to reduce the communicati...

06/29/2022
AFAFed – Protocol analysis
In this paper, we design, analyze the convergence properties and address...

02/06/2021
Multi-Tier Federated Learning for Vertically Partitioned Data
We consider decentralized model training in tiered communication network...
