FedCau: A Proactive Stop Policy for Communication and Computation Efficient Federated Learning

04/16/2022
by   Afsaneh Mahmoudi, et al.

This paper investigates communication- and computation-efficient distributed training of a Federated Learning (FL) model over a network of wireless devices. The communication iterations of the distributed training algorithm may be substantially slowed, or even blocked, by the devices' background traffic, packet losses, congestion, or latency. We abstract these communication-computation impacts as an `iteration cost' and propose a cost-aware causal FL algorithm (FedCau) to tackle this problem. FedCau uses an iteration-termination method that trades off training performance against networking costs. We apply our approach when clients use the slotted-ALOHA, carrier-sense multiple access with collision avoidance (CSMA/CA), and orthogonal frequency-division multiple access (OFDMA) protocols. We show that, given a total cost budget, the training performance degrades as either the background communication traffic or the dimension of the training problem increases. Our results demonstrate the importance of proactively designing optimal cost-efficient stopping criteria, which avoid spending substantial communication-computation costs on only marginal improvements in FL training. We validate our method by training and testing FL on the MNIST dataset. Finally, we apply our approach to existing communication-efficient FL methods from the literature, achieving further efficiency gains. We conclude that cost-efficient stopping criteria are essential for the success of practical FL over wireless networks.
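To make the idea of a cost-aware stopping criterion concrete, the sketch below shows one simple causal rule of the kind the abstract describes: stop training at the first iteration whose marginal loss improvement per unit of communication-computation cost falls below a threshold, or when a total cost budget is exhausted. The threshold rule, the function name `causal_stop`, and its parameters are illustrative assumptions for this sketch, not the paper's exact FedCau criterion.

```python
def causal_stop(losses, costs, eps=1e-3, budget=float("inf")):
    """Illustrative cost-aware causal stopping rule (not the paper's exact method).

    losses[k]: training loss observed after iteration k (only past values are used,
               so the rule is causal).
    costs[k]:  communication + computation cost of iteration k.
    Returns the index of the iteration at which training should stop.
    """
    spent = 0.0
    for k in range(len(losses)):
        spent += costs[k]
        if spent > budget:
            # Total cost budget exceeded: stop before this iteration.
            return max(k - 1, 0)
        if k > 0:
            gain = losses[k - 1] - losses[k]  # marginal training improvement
            if gain / costs[k] < eps:
                # Improvement no longer justifies the iteration cost.
                return k
    return len(losses) - 1

# Example: geometrically decaying loss with a unit cost per iteration.
losses = [0.5 * (0.8 ** k) for k in range(50)]
costs = [1.0] * 50
k_stop = causal_stop(losses, costs, eps=1e-3)
```

With this decay profile, the rule stops well before the 50th iteration: later rounds buy shrinking loss reductions at constant cost, which is exactly the regime where the abstract argues that further communication is wasteful.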


