ProgFed: Effective, Communication, and Computation Efficient Federated Learning by Progressive Training

10/11/2021
by Hui-Po Wang, et al.

Federated learning is a powerful distributed learning scheme that allows numerous edge devices to collaboratively train a model without sharing their data. However, training is resource-intensive for edge devices, and limited network bandwidth is often the main bottleneck. Prior work often addresses these constraints by condensing the models or messages into compact formats, e.g., via gradient compression or distillation. In contrast, we propose ProgFed, the first progressive training framework for efficient and effective federated learning. It inherently reduces computation and two-way communication costs while maintaining the strong performance of the final models. We theoretically prove that ProgFed converges at the same asymptotic rate as standard training on full models. Extensive results on a broad range of architectures, including CNNs (VGG, ResNet, ConvNets) and U-nets, and on diverse tasks from simple classification to medical image segmentation, show that our highly effective training approach saves up to 20% computation and up to 63% communication costs for converged models. As our approach is also complementary to prior work on compression, we can achieve a wide range of trade-offs, reducing communication by up to 50× at only 0.1% loss in utility.
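The core idea lends itself to a short sketch. Below is a minimal, illustrative PyTorch rendition of progressive federated training, assuming the model is expressed as a list of blocks that are activated stage by stage; the helper names (make_stage_model, local_update, fedavg) and the hyperparameters are our own assumptions for illustration, not the paper's reference implementation. Because clients only exchange the parameters of the currently active sub-model, early rounds compute and communicate far less than full-model training.

```python
import copy
import torch
import torch.nn as nn

def make_stage_model(blocks, head, stage):
    """Sub-model built from the first `stage` blocks plus a fresh head."""
    return nn.Sequential(*blocks[:stage], copy.deepcopy(head))

def local_update(model, loader, lr=0.01, epochs=1):
    """One client's local SGD pass; returns its updated parameters."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return model.state_dict()

def fedavg(client_states):
    """Standard FedAvg: average the clients' parameter dictionaries."""
    avg = copy.deepcopy(client_states[0])
    for key in avg:
        avg[key] = torch.stack(
            [s[key].float() for s in client_states]).mean(dim=0)
    return avg

def progressive_federated_training(blocks, head, client_loaders,
                                   rounds_per_stage=10):
    """Train progressively: stage s uses only the first s blocks, so both
    computation and two-way communication grow with the model."""
    model = None
    for stage in range(1, len(blocks) + 1):
        model = make_stage_model(blocks, head, stage)
        for _ in range(rounds_per_stage):
            # Broadcast the (small) active sub-model, collect local updates.
            states = [local_update(copy.deepcopy(model), loader)
                      for loader in client_loaders]
            model.load_state_dict(fedavg(states))
        # Carry the trained blocks over into the next, deeper stage.
        for i in range(stage):
            blocks[i].load_state_dict(model[i].state_dict())
    return model
```

Since the exchanged objects are ordinary state dicts, applying gradient compression or quantization to them on top of this schedule is straightforward, which is how the compounded communication savings mentioned above can be obtained.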

Related research

11/12/2019 · Hyper-Sphere Quantization: Communication-Efficient SGD for Federated Learning
The high cost of communicating gradients is a major bottleneck for feder...

09/10/2021 · Toward Communication Efficient Adaptive Gradient Method
In recent years, distributed optimization is proven to be an effective a...

08/02/2021 · Communication-Efficient Federated Learning via Predictive Coding
Federated learning can enable remote workers to collaboratively train a ...

11/23/2019 · Compressing Representations for Embedded Deep Learning
Despite recent advances in architectures for mobile devices, deep learni...

09/11/2023 · Towards Federated Learning Under Resource Constraints via Layer-wise Training and Depth Dropout
Large machine learning models trained on diverse data have recently seen...

03/15/2022 · Privacy-Aware Compression for Federated Data Analysis
Federated data analytics is a framework for distributed data analysis wh...

09/23/2020 · Edge Learning with Timeliness Constraints: Challenges and Solutions
Future machine learning (ML) powered applications, such as autonomous dr...
