Communication-efficient Federated Learning with Single-Step Synthetic Features Compressor for Faster Convergence

02/27/2023
by   Yuhao Zhou, et al.

Reducing communication overhead in federated learning (FL) is challenging but crucial for large-scale distributed privacy-preserving machine learning. While methods such as sparsification can greatly lower the communication overhead, they also greatly compromise the convergence rate. In this paper, we propose a novel method, named single-step synthetic features compressor (3SFC), to achieve communication-efficient FL by directly constructing a tiny synthetic dataset from raw gradients. 3SFC can thus achieve an extremely low compression rate when the constructed dataset contains only one data sample. Moreover, 3SFC's compressing phase uses a similarity-based objective function that can be optimized in a single step, considerably improving its performance and robustness. In addition, error feedback (EF) is incorporated into 3SFC to minimize the compression error. Experiments on multiple datasets and models suggest that 3SFC achieves significantly better convergence rates than competing methods at lower compression rates (up to 0.02%). Visualizations show that 3SFC carries more information than competing methods in every communication round, further validating its effectiveness.
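To make the core idea concrete, here is a minimal toy sketch of gradient compression via a synthetic sample with error feedback. It assumes a one-layer linear regression model, where the synthetic sample admits a closed-form solution and reconstruction happens to be exact; the paper's actual 3SFC instead takes a single optimization step on a similarity-based objective for deep models, where the synthetic sample is far smaller than the full gradient. The function names `compress` and `decompress` are hypothetical, not from the paper.

```python
import numpy as np

def compress(w, g):
    """Encode gradient g as one synthetic sample (x, y).

    Illustrative only: for a linear model the sample can be chosen in
    closed form so that the gradient it induces matches g.
    """
    x = g / np.linalg.norm(g)       # synthetic feature along g
    y = w @ x - np.linalg.norm(g)   # label chosen so the residual equals ||g||
    return x, y

def decompress(w, x, y):
    """Server-side reconstruction: the gradient of 0.5*(w@x - y)**2
    with respect to w is (w@x - y) * x."""
    return (w @ x - y) * x

rng = np.random.default_rng(0)
w = rng.normal(size=5)   # current model weights
g = rng.normal(size=5)   # raw local gradient to be communicated

# Error feedback: the residual e from this round is added to the
# gradient compressed in the next round, so compression error
# accumulates instead of being silently dropped.
e = np.zeros_like(g)
x, y = compress(w, g + e)
g_hat = decompress(w, x, y)
e = (g + e) - g_hat
```

In this toy linear setting `g_hat` equals `g` and the error-feedback buffer stays zero; with a deep model and a single optimization step, `e` would carry the nonzero compression residual into the next round.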

Related research:
- Preconditioned Federated Learning (09/20/2023)
- z-SignFedAvg: A Unified Stochastic Sign-based Compression for Federated Learning (02/06/2023)
- FedREP: A Byzantine-Robust, Communication-Efficient and Privacy-Preserving Framework for Federated Learning (03/09/2023)
- H-FL: A Hierarchical Communication-Efficient and Privacy-Protected Architecture for Federated Learning (06/01/2021)
- Towards Efficient and Stable K-Asynchronous Federated Learning with Unbounded Stale Gradients on Non-IID Data (03/02/2022)
- Fast Federated Learning by Balancing Communication Trade-Offs (05/23/2021)
- Federated Nearest Neighbor Machine Translation (02/23/2023)
