Cooperative Learning via Federated Distillation over Fading Channels

02/03/2020
by Jin-Hyun Ahn, et al.

Cooperative training methods for distributed machine learning are typically based on the exchange of local gradients or local model parameters. The latter approach is known as Federated Learning (FL). An alternative solution with reduced communication overhead, referred to as Federated Distillation (FD), was recently proposed; it exchanges only averaged model outputs. While prior work studied implementations of FL over wireless fading channels, here we propose wireless protocols for FD and for an enhanced version thereof that leverages an offline communication phase to communicate “mixed-up” covariate vectors. The proposed implementations consist of different combinations of digital schemes based on separate source-channel coding and of over-the-air computing strategies based on analog joint source-channel coding. It is shown that the enhanced version of FD has the potential to significantly outperform FL in the presence of limited spectral resources.
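To illustrate why FD reduces communication overhead, the following is a minimal sketch of the FD exchange pattern: each device uploads only its per-class averaged model outputs (logits) rather than full model parameters, the server averages them, and each device then uses the broadcast average as a distillation target. The array shapes, the mean-squared-error distillation penalty, and the random stand-in for local model outputs are illustrative assumptions, not the paper's exact protocol.

```python
import numpy as np

rng = np.random.default_rng(0)
NUM_DEVICES, NUM_CLASSES = 3, 4

def local_avg_logits(rng, num_classes):
    # Stand-in for a device's per-class averaged model outputs:
    # row c holds the average logit vector over local samples of class c.
    # (In real FD these come from the local model; random here for illustration.)
    return rng.normal(size=(num_classes, num_classes))

# Each device uploads only num_classes * num_classes numbers,
# instead of the full model parameter vector as in Federated Learning.
uploads = [local_avg_logits(rng, NUM_CLASSES) for _ in range(NUM_DEVICES)]

# Server aggregates by simple averaging and broadcasts the result.
global_targets = np.mean(uploads, axis=0)

# Each device then distills: it regularizes local training toward the
# global targets, e.g. with a mean-squared-error penalty on its own logits.
device_logits = uploads[0]
distill_loss = float(np.mean((device_logits - global_targets) ** 2))
print(global_targets.shape, distill_loss >= 0.0)
```

The communication cost per round scales with the number of classes squared rather than with the model size, which is the source of FD's bandwidth savings; the over-the-air computing variants discussed in the paper additionally exploit the fact that the server only needs the sum of the uploads, not the individual terms.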


