Communication-Efficient Federated Learning with Binary Neural Networks

by Yuzhi Yang et al.

Federated learning (FL) is a privacy-preserving machine learning setting that enables many devices to jointly train a shared global model without revealing their data to a central server. However, FL involves frequent exchanges of parameters between all the clients and the server that coordinates the training. This introduces extensive communication overhead, which can be a major bottleneck in FL over limited communication links. In this paper, we consider training binary neural networks (BNNs) in the FL setting, instead of the typical real-valued neural networks, to fulfill the stringent delay and efficiency requirements of wireless edge networks. We introduce a novel FL framework for training BNNs in which the clients upload only the binary parameters to the server. We also propose a novel parameter updating scheme based on Maximum Likelihood (ML) estimation that preserves the performance of the BNN even without access to the aggregated real-valued auxiliary parameters that are usually needed during BNN training. Moreover, for the first time in the literature, we theoretically derive the conditions under which BNN training converges. Numerical results show that the proposed FL framework significantly reduces the communication cost compared to conventional neural networks with real-valued parameters, and that the performance loss incurred by binarization can be further compensated by a hybrid method.
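To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of the core communication pattern the abstract describes: each client binarizes its local weights to {-1, +1} and uploads only the signs, and the server aggregates them with an elementwise majority vote, which is the ML estimate of the shared binary parameter under a simple symmetric bit-flip model. The helper names (`binarize`, `ml_aggregate`) and the noise model are assumptions for illustration only.

```python
import numpy as np

def binarize(w):
    # Map real-valued weights to {-1, +1} by sign (zeros map to +1).
    return np.where(w >= 0, 1.0, -1.0)

def ml_aggregate(binary_updates):
    # Elementwise majority vote across clients: the sign of the sum.
    # Under a symmetric bit-flip noise model this is the ML estimate
    # of each underlying binary parameter (an illustrative assumption).
    stacked = np.stack(binary_updates)
    return binarize(stacked.sum(axis=0))

# Each client binarizes its local real-valued weights and uploads
# only the 1-bit signs, which is where the communication savings arise.
client_weights = [np.array([0.3, -1.2, 0.7]),
                  np.array([0.1, -0.4, -0.2]),
                  np.array([-0.5, -0.9, 0.6])]
uploads = [binarize(w) for w in client_weights]
global_binary = ml_aggregate(uploads)
print(global_binary)  # -> [ 1. -1.  1.]
```

Each upload costs one bit per parameter rather than 32, which is the source of the communication savings the paper targets; the hybrid method mentioned in the abstract would mix such binary updates with occasional real-valued ones.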




