FedLite: A Scalable Approach for Federated Learning on Resource-constrained Clients

01/28/2022
by Jianyu Wang, et al.

In classical federated learning, clients contribute to training by sending local updates of the model, computed on their private data, to a coordinating server. However, updating and communicating the entire model becomes prohibitively expensive when resource-constrained clients collectively aim to train a large machine learning model. Split learning offers a natural solution in this setting: only a small part of the model is stored and trained on the clients, while the remaining large part of the model stays on the server. However, the model partitioning employed in split learning introduces significant communication cost, since intermediate activations and their gradients must cross the network every step. This paper addresses the issue by compressing this additional communication with a novel clustering scheme accompanied by a gradient correction method. Extensive empirical evaluations on image and text benchmarks show that the proposed method achieves up to a 490× reduction in communication cost with minimal loss in accuracy, enabling a desirable performance vs. communication trade-off.
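To make the compression idea concrete, here is a minimal sketch (not FedLite's actual algorithm; the gradient-correction step is omitted) of how clustering can shrink the activations a split-learning client sends to the server: instead of transmitting raw floating-point activations, the client sends a small codebook of k-means centroids plus one small index per sample. All shapes and the toy model are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans(x, k, iters=10):
    """Plain Lloyd's k-means over the rows of x (illustrative, not optimized)."""
    centers = x[rng.choice(len(x), size=k, replace=False)]
    for _ in range(iters):
        # Assign each row to its nearest centroid.
        dists = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        assign = dists.argmin(1)
        # Move each centroid to the mean of its assigned rows.
        for j in range(k):
            pts = x[assign == j]
            if len(pts):
                centers[j] = pts.mean(0)
    return centers, assign

# Client side: the small "front" model produces activations for a batch
# (here just random data standing in for real activations).
acts = rng.normal(size=(64, 32)).astype(np.float32)  # 64 samples, 32-dim cut layer

# Compress: transmit k centroid vectors plus one index per sample
# instead of the full activation matrix.
k = 8
centers, assign = kmeans(acts, k)

raw_bytes = acts.size * 4                              # float32 activations
compressed_bytes = centers.size * 4 + len(assign) * 1  # codebook + 1-byte indices (k <= 256)
ratio = raw_bytes / compressed_bytes
print(f"compression ratio: {ratio:.1f}x")
```

The server reconstructs each activation as its assigned centroid (`centers[assign]`) and runs the large back half of the model on the result; the same clustering trick can be applied to the gradients sent back. The accuracy lost to this quantization is what a gradient correction method, as in the paper, is designed to counteract.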
