- Asynchronous Online Federated Learning for Edge Devices
  Federated learning (FL) is a machine learning paradigm where a shared ce...
- Communication-Efficient Federated Deep Learning with Asynchronous Model Update and Temporally Weighted Aggregation
  Federated learning obtains a central model on the server by aggregating ...
- Asynchronous Edge Learning using Cloned Knowledge Distillation
  With the increasing demand for more and more data, the federated learnin...
- SAFA: a Semi-Asynchronous Protocol for Fast Federated Learning with Low Overhead
  Federated learning (FL) has attracted increasing attention as a promisin...
- VAFL: a Method of Vertical Asynchronous Federated Learning
  Horizontal Federated learning (FL) handles multi-client data that share ...
- Design and Analysis of Uplink and Downlink Communications for Federated Learning
  Communication has been known to be one of the primary bottlenecks of fed...
- Asynchronous Federated Learning with Reduced Number of Rounds and with Differential Privacy from Less Aggregated Gaussian Noise
  The feasibility of federated learning is highly constrained by the serve...
FedAT: A Communication-Efficient Federated Learning Method with Asynchronous Tiers under Non-IID Data
Federated learning (FL) involves training a model over massive distributed devices, while keeping the training data localized. This form of collaborative learning exposes new tradeoffs among model convergence speed, model accuracy, balance across clients, and communication cost, with new challenges including: (1) the straggler problem, where clients lag due to data or (computing and network) resource heterogeneity, and (2) the communication bottleneck, where a large number of clients communicate their local updates to a central server and bottleneck the server. Many existing FL methods focus on optimizing along only one dimension of the tradeoff space. Existing solutions use asynchronous model updating or tiering-based synchronous mechanisms to tackle the straggler problem. However, asynchronous methods can easily create a network communication bottleneck, while tiering may introduce biases because tiering favors faster tiers with shorter response latencies. To address these issues, we present FedAT, a novel Federated learning method with Asynchronous Tiers under Non-i.i.d. data. FedAT synergistically combines synchronous intra-tier training and asynchronous cross-tier training. By bridging the synchronous and asynchronous training through tiering, FedAT minimizes the straggler effect with improved convergence speed and test accuracy. FedAT uses a straggler-aware, weighted aggregation heuristic to steer and balance the training for further accuracy improvement. FedAT compresses the uplink and downlink communications using an efficient, polyline-encoding-based compression algorithm, thereby minimizing the communication cost. Results show that FedAT improves the prediction performance by up to 21.09%, and reduces the communication cost by up to 8.5x, compared to state-of-the-art FL methods.
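The tiered design described in the abstract can be made concrete with a short sketch. The snippet below is a minimal illustration, not the authors' implementation: `intra_tier_average` stands in for the synchronous, FedAvg-style step within one tier, and `cross_tier_aggregate` stands in for the asynchronous cross-tier step. The inverse-update-count weighting is an assumed form of the paper's straggler-aware heuristic, and the function names and toy NumPy models are hypothetical.

```python
import numpy as np

def intra_tier_average(client_updates):
    """Synchronous step: FedAvg-style mean over one tier's client updates."""
    return np.mean(np.stack(client_updates), axis=0)

def cross_tier_aggregate(tier_models, tier_update_counts):
    """Asynchronous step: weighted average of per-tier models.

    Tiers that have pushed fewer updates so far (the stragglers) receive
    larger weights, so fast tiers do not dominate the global model. The
    inverse-count weighting is an assumption, not FedAT's exact formula.
    """
    counts = np.asarray(tier_update_counts, dtype=float)
    weights = 1.0 / (counts + 1.0)   # favor rarely-updating tiers
    weights /= weights.sum()         # normalize to a convex combination
    return sum(w * m for w, m in zip(weights, tier_models))

# Toy usage: three tiers of five clients each; the last tier is a
# straggler that has pushed only one update so far.
rng = np.random.default_rng(0)
tier_models = [intra_tier_average([rng.normal(size=4) for _ in range(5)])
               for _ in range(3)]
new_global = cross_tier_aggregate(tier_models, tier_update_counts=[10, 6, 1])
print(new_global)
```

Up-weighting the tiers that update least often is one way to counteract the bias toward fast tiers that the abstract warns about: without it, the asynchronous cross-tier step would fold in the fastest tier's model many more times per unit of wall-clock time.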