Harnessing Wireless Channels for Scalable and Privacy-Preserving Federated Learning

07/03/2020
by Anis Elgabli, et al.

Wireless connectivity is instrumental in enabling scalable federated learning (FL), yet wireless channels pose challenges for model training: channel randomness perturbs each worker's model update, and simultaneous updates from multiple workers incur significant interference under limited bandwidth. To address these challenges, in this work we formulate a novel constrained optimization problem and propose an FL framework that harnesses wireless channel perturbations and interference to improve privacy, bandwidth efficiency, and scalability. The resulting algorithm, coined analog federated ADMM (A-FADMM), is based on analog transmissions and the alternating direction method of multipliers (ADMM). In A-FADMM, all workers upload their model updates to the parameter server (PS) over a single channel via analog transmissions, during which all models are perturbed and aggregated over-the-air. This not only saves communication bandwidth, but also hides each worker's exact model-update trajectory from any eavesdropper, including the honest-but-curious PS, thereby preserving data privacy against model inversion attacks. We formally prove the convergence and privacy guarantees of A-FADMM for convex functions under time-varying channels, and numerically demonstrate its effectiveness under noisy channels and stochastic non-convex functions in terms of convergence speed and scalability, as well as communication bandwidth and energy efficiency.
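To make the over-the-air aggregation idea concrete, below is a minimal NumPy sketch of analog superposition at the parameter server. It is not the authors' implementation: the number of workers, the uniform fading model, the noise level, and the simple averaging step are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
num_workers, dim = 10, 5

# Each worker's local model update (illustrative random vectors standing
# in for, e.g., the result of a local ADMM primal step).
updates = [rng.normal(size=dim) for _ in range(num_workers)]

# Time-varying fading gains, one per worker (assumed channel model).
channels = rng.uniform(0.5, 1.5, size=num_workers)

# All workers transmit simultaneously over the SAME analog channel, so
# their waveforms superpose: the PS observes only a single noisy sum,
# never any individual worker's update.
noise = rng.normal(scale=0.01, size=dim)
received = sum(h * u for h, u in zip(channels, updates)) + noise

# The PS aggregates directly from the superposed signal; the unknown
# per-worker gains hide each worker's exact update trajectory.
global_update = received / num_workers
print(global_update)
```

Because every worker shares a single channel use per round, the uplink bandwidth cost stays constant as the number of workers grows, and the randomness injected by the unknown channel gains and noise is what obscures individual update trajectories from an eavesdropper or the honest-but-curious PS.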


Related research

11/19/2021  Over-the-Air Federated Learning with Retransmissions (Extended Version)
04/08/2021  Joint Optimization of Communications and Federated Learning Over the Air
04/15/2023  Communication and Energy Efficient Wireless Federated Learning with Intrinsic Privacy
05/08/2023  Federated Learning in Wireless Networks via Over-the-Air Computations
06/02/2021  Communication-Efficient Split Learning Based on Analog Communication and Over the Air Aggregation
05/02/2021  AirMixML: Over-the-Air Data Mixup for Inherently Privacy-Preserving Edge Machine Learning
02/05/2022  Communication Efficient Federated Learning via Ordered ADMM in a Fully Decentralized Setting
