Knowledge Distillation For Wireless Edge Learning

04/03/2021
by Ahmed P. Mohamed, et al.

In this paper, we propose a framework for predicting frame errors in the collaborative, spectrally congested wireless environments of the DARPA Spectrum Collaboration Challenge (SC2) using a recently collected dataset. We employ distributed deep edge learning that is shared among edge nodes and a central cloud. Using this close-to-practice dataset, we find that widely used federated learning approaches, especially those that are privacy-preserving, perform worse than local training across a wide range of settings. We therefore employ the synthetic minority oversampling technique (SMOTE) to preserve privacy by avoiding the transfer of local data to the cloud, and apply knowledge distillation to benefit from the cloud's high computing and storage capabilities. The proposed framework achieves better overall performance than both local and federated training, while remaining robust against catastrophic failures as well as challenging channel conditions that result in high frame error rates.
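The abstract names two building blocks: SMOTE at the edge so raw local data never leaves the node, and knowledge distillation so a large cloud "teacher" transfers its knowledge to a compact edge "student". The sketch below is not the authors' code; it is a minimal illustration assuming PyTorch and the imbalanced-learn SMOTE implementation. The feature dimension, model sizes, temperature T, and mixing weight alpha are illustrative assumptions, not values from the paper.

```python
# Illustrative sketch (assumed setup, not the paper's code): SMOTE synthesizes
# minority-class (frame-error) samples locally, and a distillation loss blends
# the teacher's softened predictions with the hard labels.
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F
from imblearn.over_sampling import SMOTE

# --- Edge side: oversample the rare frame-error class; only synthetic data is shared ---
rng = np.random.default_rng(0)
X_local = rng.normal(size=(1000, 16)).astype(np.float32)   # hypothetical local link features
y_local = (rng.random(1000) < 0.05).astype(int)            # ~5% frame errors (assumed imbalance)
X_syn, y_syn = SMOTE(random_state=0).fit_resample(X_local, y_local)

# --- Cloud teacher (assumed large) and edge student (assumed small) ---
teacher = nn.Sequential(nn.Linear(16, 256), nn.ReLU(), nn.Linear(256, 2))
student = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Soft-target KL (teacher -> student) blended with the usual hard-label loss."""
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# One illustrative student update on the synthetic batch.
x = torch.from_numpy(X_syn.astype(np.float32))
y = torch.from_numpy(y_syn).long()
opt = torch.optim.Adam(student.parameters(), lr=1e-3)
with torch.no_grad():
    t_logits = teacher(x)            # teacher is assumed already trained in the cloud
loss = distillation_loss(student(x), t_logits, y)
opt.zero_grad()
loss.backward()
opt.step()
```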

Related research

10/16/2022 · Federated Learning with Privacy-Preserving Ensemble Attention Distillation
Federated Learning (FL) is a machine learning paradigm where many local ...

09/10/2022 · Preserving Privacy in Federated Learning with Ensemble Cross-Domain Knowledge Distillation
Federated Learning (FL) is a machine learning paradigm where local nodes...

10/20/2020 · Asynchronous Edge Learning using Cloned Knowledge Distillation
With the increasing demand for more and more data, the federated learnin...

07/05/2019 · Wireless Federated Distillation for Distributed Edge Learning with Heterogeneous Data
Cooperative training methods for distributed machine learning typically ...

04/09/2023 · Homogenizing Non-IID datasets via In-Distribution Knowledge Distillation for Decentralized Learning
Decentralized learning enables serverless training of deep neural networ...

08/27/2021 · Canoe: A System for Collaborative Learning for Neural Nets
For highly distributed environments such as edge computing, collaborativ...
