Attentive Federated Learning for Concept Drift in Distributed 5G Edge Networks
Machine learning (ML) is expected to play a major role in 5G edge computing. Various studies have demonstrated that ML is highly suitable for optimizing edge computing systems as rapid mobility and application-induced changes occur at the edge. For ML to provide the best solutions, the ML models must be continually trained on the changing scenarios. The sudden changes in data distributions caused by such scenarios (e.g., 5G base station failures) are referred to as concept drift and pose a major challenge to continual learning. ML models can exhibit high error rates while drifts take place, and the errors decrease only after the models learn the new distributions. This problem is more pronounced in a distributed setting, where multiple ML models are trained on different heterogeneous datasets and the final model needs to capture all concept drifts. In this paper, we show that using Attention in Federated Learning (FL) is an efficient way of handling concept drifts. We use a 5G network traffic dataset to simulate concept drift and test various scenarios. The results indicate that Attention can significantly improve the concept drift handling capability of FL.
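The abstract does not spell out how attention enters the FL aggregation step, so the following is a minimal, hypothetical sketch of attention-weighted server-side aggregation: client updates are combined with softmax weights derived from the parameter distance between each client model and the current server model. The function name, the distance-based weighting direction, and the step size `epsilon` are illustrative assumptions, not the paper's confirmed mechanism.

```python
# Illustrative sketch only: attention-weighted federated aggregation.
# This is NOT necessarily the mechanism used in the paper.
import numpy as np

def attentive_aggregate(server_params, client_params, epsilon=1.0):
    """Aggregate client parameter vectors into the server model.

    Attention scores are a softmax over the negative L2 distance
    between the server parameters and each client's parameters, so
    a client whose model has diverged sharply (e.g., after a local
    concept drift) receives a distinct weight rather than the uniform
    1/K weight of plain federated averaging.
    """
    # Distance of each client model from the current server model.
    dists = np.array([np.linalg.norm(server_params - c) for c in client_params])
    # Softmax over negative distances -> attention weights summing to 1.
    scores = np.exp(-dists)
    att = scores / scores.sum()
    # Move the server model toward the attention-weighted client update.
    update = sum(a * (c - server_params) for a, c in zip(att, client_params))
    return server_params + epsilon * update

# Toy usage: three clients, one of which has drifted away from the rest.
server = np.zeros(4)
clients = [np.array([0.1, 0.0, 0.1, 0.0]),
           np.array([0.0, 0.1, 0.0, 0.1]),
           np.array([2.0, 2.0, 2.0, 2.0])]  # drifted client
print(attentive_aggregate(server, clients))
```

In this particular sketch, closer clients receive larger weights; attentive FL variants in the literature also use the opposite convention (up-weighting distant clients), and the paper's choice is not stated in the abstract.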