Does Federated Learning Really Need Backpropagation?

01/28/2023
by Haozhe Feng et al.

Federated learning (FL) is a general principle for decentralized clients to train a server model collectively without sharing local data. FL is a promising framework with practical applications, but its standard training paradigm requires the clients to backpropagate through the model to compute gradients. Since these clients are typically edge devices and not fully trusted, executing backpropagation on them incurs computational and storage overhead as well as white-box vulnerability. In light of this, we develop backpropagation-free federated learning, dubbed BAFFLE, in which backpropagation is replaced by multiple forward processes to estimate gradients. BAFFLE is 1) memory-efficient and easily fits within uploading bandwidth; 2) compatible with inference-only hardware optimization and model quantization or pruning; and 3) well-suited to trusted execution environments, because the clients in BAFFLE only execute forward propagation and return a set of scalars to the server. Empirically, we use BAFFLE to train deep models from scratch or to finetune pretrained models, achieving acceptable results. Code is available at https://github.com/FengHZ/BAFFLE.
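The abstract states that backpropagation is replaced by multiple forward passes whose resulting scalars are returned to the server to estimate gradients. Below is a minimal, hypothetical sketch of one such forward-only (zeroth-order) estimator based on central finite differences along random perturbation directions; the function name, hyperparameters, and the use of PyTorch are illustrative assumptions, not the paper's exact estimator.

```python
# Hypothetical sketch of forward-only gradient estimation via central finite
# differences along random directions. This is the general idea behind
# replacing backpropagation with forward passes, not BAFFLE's exact procedure.
import torch
import torch.nn as nn

def forward_only_grad(model, loss_fn, x, y, num_dirs=20, sigma=1e-3):
    """Estimate parameter gradients using only forward passes."""
    params = list(model.parameters())
    grads = [torch.zeros_like(p) for p in params]
    with torch.no_grad():  # clients never build a backward graph
        for _ in range(num_dirs):
            # Sample one random perturbation direction per parameter tensor.
            deltas = [torch.randn_like(p) for p in params]
            # Forward pass at theta + sigma * delta.
            for p, d in zip(params, deltas):
                p.add_(sigma * d)
            loss_plus = loss_fn(model(x), y)
            # Forward pass at theta - sigma * delta.
            for p, d in zip(params, deltas):
                p.sub_(2 * sigma * d)
            loss_minus = loss_fn(model(x), y)
            # Restore the original parameters.
            for p, d in zip(params, deltas):
                p.add_(sigma * d)
            # In a BAFFLE-like protocol, the client could upload only this
            # scalar; a server sharing the random seed can rebuild the deltas.
            scalar = (loss_plus - loss_minus) / (2 * sigma)
            for g, d in zip(grads, deltas):
                g.add_(scalar * d / num_dirs)
    return grads

# Usage sketch with toy data and one SGD-style update.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
x, y = torch.randn(8, 10), torch.randint(0, 2, (8,))
grads = forward_only_grad(model, nn.CrossEntropyLoss(), x, y)
for p, g in zip(model.parameters(), grads):
    p.data.add_(-0.1 * g)
```

Because every step above is a forward evaluation, such an estimator is compatible with inference-only hardware and with running the client inside a trusted execution environment, which matches the properties claimed for BAFFLE.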


Related research

06/15/2021 · CRFL: Certifiably Robust Federated Learning against Backdoor Attacks
09/03/2023 · FedFwd: Federated Learning without Backpropagation
03/05/2023 · Knowledge-Enhanced Semi-Supervised Federated Learning for Aggregating Heterogeneous Lightweight Clients in IoT
04/07/2022 · Federated Learning from Only Unlabeled Data with Class-Conditional-Sharing Clients
04/09/2022 · Local Learning Matters: Rethinking Data Heterogeneity in Federated Learning
03/10/2023 · Complement Sparsification: Low-Overhead Model Pruning for Federated Learning
03/03/2022 · Vertical Federated Principal Component Analysis and Its Kernel Extension on Feature-wise Distributed Data
