Robust and Communication-Efficient Federated Learning from Non-IID Data

03/07/2019
by Felix Sattler, et al.

Federated Learning allows multiple parties to jointly train a deep learning model on their combined data, without any of the participants having to reveal their local data to a centralized server. This form of privacy-preserving collaborative learning, however, comes at the cost of a significant communication overhead during training. To address this problem, several compression methods have been proposed in the distributed training literature that can reduce the amount of required communication by up to three orders of magnitude. However, these existing methods are of limited utility in the Federated Learning setting, as they either compress only the upstream communication from the clients to the server (leaving the downstream communication uncompressed) or perform well only under idealized conditions, such as an IID distribution of the client data, that typically do not hold in Federated Learning. In this work, we propose Sparse Ternary Compression (STC), a new compression framework that is specifically designed to meet the requirements of the Federated Learning environment. Our experiments on four different learning tasks demonstrate that STC distinctly outperforms Federated Averaging in common Federated Learning scenarios where clients a) hold non-IID data, b) use small batch sizes during training, or c) are numerous while the participation rate in every communication round is low. We furthermore show that even if the clients hold IID data and use medium-sized batches for training, STC still behaves Pareto-superior to Federated Averaging, in the sense that it reaches fixed target accuracies on our benchmarks within both fewer training iterations and a smaller communication budget.
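
The abstract does not spell out the STC codec itself. As a rough sketch only, the NumPy snippet below illustrates the general idea behind sparse ternary compression of a weight update: keep the top-k entries by magnitude and ternarize them to a single shared magnitude. The function name, the default sparsity value, and the return convention are illustrative assumptions, not taken from the paper.

import numpy as np

def sparse_ternary_compress(tensor, sparsity=0.01):
    """Top-k sparsification followed by ternarization (illustrative sketch).

    Keeps the fraction `sparsity` of entries with the largest magnitude and
    replaces each kept entry with sign(entry) * mu, where mu is the mean
    magnitude of the kept entries; all other entries become zero.
    """
    flat = tensor.ravel()
    k = max(1, int(sparsity * flat.size))
    # Indices of the k largest-magnitude entries.
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    mu = np.abs(flat[idx]).mean()   # single shared magnitude
    signs = np.sign(flat[idx])      # each kept entry becomes -mu or +mu
    compressed = np.zeros_like(flat)
    compressed[idx] = signs * mu
    return compressed.reshape(tensor.shape), idx, signs, mu

# Example: compress a simulated 256x256 weight-update tensor.
rng = np.random.default_rng(0)
update = rng.normal(size=(256, 256)).astype(np.float32)
stc_update, idx, signs, mu = sparse_ternary_compress(update, sparsity=0.01)
print(f"non-zero entries kept: {np.count_nonzero(stc_update)} of {update.size}")

Under a scheme of this shape, a client only has to transmit the positions of the k surviving entries, one sign bit per survivor, and the single float mu, instead of a dense float32 tensor; compression rates of several orders of magnitude follow directly from choosing a small sparsity fraction.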


Related research

05/05/2022  Communication-Efficient Adaptive Federated Learning
Federated learning is a machine learning training paradigm that enables ...

02/01/2023  DoCoFL: Downlink Compression for Cross-Device Federated Learning
Many compression techniques have been proposed to reduce the communicati...

06/15/2021  On Large-Cohort Training for Federated Learning
Federated learning methods typically learn a model by iteratively sampli...

06/22/2020  Exact Support Recovery in Federated Regression with One-shot Communication
Federated learning provides a framework to address the challenges of dis...

08/18/2021  Learning Federated Representations and Recommendations with Limited Negatives
Deep retrieval models are widely used for learning entity representation...

03/28/2023  Communication-Efficient Vertical Federated Learning with Limited Overlapping Samples
Federated learning is a popular collaborative learning approach that ena...

09/12/2022  Communication-Efficient and Privacy-Preserving Feature-based Federated Transfer Learning
Federated learning has attracted growing interest as it preserves the cl...
