Sparser Random Networks Exist: Enforcing Communication-Efficient Federated Learning via Regularization

09/19/2023
by Mohamad Mestoukirdi, et al.

This work presents a new method for improving communication efficiency in stochastic federated learning that trains over-parameterized random networks. In this setting, a binary mask is optimized instead of the model weights, which remain fixed. The mask characterizes a sparse sub-network that generalizes as well as a smaller target network. Importantly, clients exchange sparse binary masks rather than the floating-point weights of traditional federated learning, reducing the communication cost to at most one bit per parameter. We show that previous state-of-the-art stochastic methods fail to find sparse networks that reduce the communication and storage overhead when trained with consistent loss objectives. To address this, we propose adding a regularization term to the local objectives that encourages sparser solutions by eliminating redundant features across sub-networks. Extensive experiments demonstrate improvements in communication and memory efficiency of up to five orders of magnitude over the literature, with minimal degradation in validation accuracy in some instances.
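To make the setup concrete, below is a minimal PyTorch-style sketch (not the authors' code) of a layer whose random weights stay frozen while a stochastic binary mask is trained. The class name MaskedLinear, the straight-through Bernoulli estimator, the mean keep-probability penalty standing in for the paper's sparsity regularizer, and the weight lam are all illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MaskedLinear(nn.Module):
    def __init__(self, in_features, out_features):
        super().__init__()
        # Fixed random weights: never updated and never communicated.
        self.weight = nn.Parameter(
            torch.randn(out_features, in_features) / in_features ** 0.5,
            requires_grad=False)
        # Trainable scores; sigmoid(score) is the keep-probability of each weight.
        self.scores = nn.Parameter(torch.zeros(out_features, in_features))

    def forward(self, x):
        probs = torch.sigmoid(self.scores)
        # Sample a binary mask, with a straight-through estimator so that
        # gradients reach the scores despite the discrete Bernoulli draw.
        mask = torch.bernoulli(probs) + probs - probs.detach()
        return F.linear(x, self.weight * mask)

    def sparsity_penalty(self):
        # Mean keep-probability; driving it down yields a sparser sub-network.
        # Assumed stand-in for the paper's regularizer, whose exact form may differ.
        return torch.sigmoid(self.scores).mean()

def local_loss(model, x, y, lam=1e-4):
    # Client objective: task loss plus the sparsity regularizer over masked layers.
    reg = sum(m.sparsity_penalty() for m in model.modules()
              if isinstance(m, MaskedLinear))
    return F.cross_entropy(model(x), y) + lam * reg

Under these assumptions, only the sampled (or thresholded) binary mask bits would be uploaded to the server after local training, which is where the at-most-one-bit-per-parameter communication cost comes from.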

Related research

- Sparse Random Networks for Communication-Efficient Federated Learning (09/30/2022): One main challenge in federated learning is the large communication cost...
- Achieving Personalized Federated Learning with Sparse Local Models (01/27/2022): Federated learning (FL) is vulnerable to heterogeneously distributed data...
- Federated Learning via Plurality Vote (10/06/2021): Federated learning allows collaborative workers to solve a machine learning...
- Efficient and Private Federated Learning with Partially Trainable Networks (10/06/2021): Federated learning is used for decentralized training of machine learning...
- Partial Variable Training for Efficient On-Device Federated Learning (10/11/2021): This paper aims to address the major challenges of Federated Learning (FL)...
- Communication Efficient Federated Learning for Generalized Linear Bandits (02/02/2022): Contextual bandit algorithms have been recently studied under the federated...
- Optimising Communication Overhead in Federated Learning Using NSGA-II (04/01/2022): Federated learning is a training paradigm according to which a server-based...
