Adaptive Federated Dropout: Improving Communication Efficiency and Generalization for Federated Learning

11/08/2020
by   Nader Bouacida, et al.

With regulations increasingly protecting users' privacy-sensitive data in recent years, access to such data has become restricted and controversial. To exploit the wealth of data generated and stored at distributed entities such as mobile phones, a decentralized machine learning setting known as Federated Learning enables geographically dispersed clients to collaboratively learn a shared model while keeping all their data on-device. However, the scale and decentralization of federated learning present new challenges. Communication between the clients and the server is a main bottleneck in federated learning's convergence time. In this paper, we propose and study Adaptive Federated Dropout (AFD), a novel technique for reducing the communication costs of federated learning. It reduces both server-client communication and computation costs by allowing each client to train locally on a selected subset of the global model. We empirically show that this strategy, combined with existing compression methods, provides up to a 57x reduction in convergence time and outperforms state-of-the-art solutions for communication efficiency. Furthermore, it improves model generalization by up to 1.7%.
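The core mechanism, letting each client train on a selected subset of the global model, can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's exact algorithm: the per-unit importance scores, the `keep_frac` parameter, and the `select_submodel`/`merge_update` helpers are assumptions introduced here for illustration.

```python
# Illustrative sketch (not the paper's exact method): the server scores the
# units of a dense layer, ships only the top-scoring fraction to a client,
# and merges the client's update back into the global weights.
import numpy as np

rng = np.random.default_rng(0)

def select_submodel(weights, scores, keep_frac=0.5):
    """Keep the highest-scoring units (rows of `weights`) of one dense layer."""
    k = max(1, int(len(scores) * keep_frac))
    idx = np.sort(np.argsort(scores)[-k:])  # indices of the units kept
    return idx, weights[idx]                # sub-model sent to the client

def merge_update(weights, idx, client_weights):
    """Write the client's updated sub-model back into the global layer."""
    merged = weights.copy()
    merged[idx] = client_weights
    return merged

# Toy global layer: 8 units x 4 inputs, with a hypothetical importance score
# per unit (in AFD such scores would be adapted during training).
global_w = rng.standard_normal((8, 4))
scores = rng.random(8)

idx, sub_w = select_submodel(global_w, scores, keep_frac=0.5)
client_w = sub_w - 0.1 * rng.standard_normal(sub_w.shape)  # stand-in for local training
new_global = merge_update(global_w, idx, client_w)
```

With `keep_frac=0.5`, only half of the layer's parameters travel in each direction, which is the source of the communication savings; units that were not selected keep their previous global values.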
