Bayesian Federated Learning over Wireless Networks

12/31/2020
by Seunghoon Lee, et al.

Federated learning is a privacy-preserving, distributed training method that uses heterogeneous data sets stored at local devices. Federated learning over wireless networks requires aggregating locally computed gradients at a server, where mobile devices send statistically distinct gradient information over heterogeneous communication links. This paper proposes a Bayesian federated learning (BFL) algorithm that aggregates the heterogeneous quantized gradient information optimally in the sense of minimizing the mean-squared error (MSE). The idea of BFL is to aggregate the one-bit quantized local gradients at the server by jointly exploiting i) the prior distributions of the local gradients, ii) the gradient quantizer function, and iii) the channel distributions. Implementing BFL incurs high communication and computational costs as the number of mobile devices increases. To address this challenge, we also present an efficient modified BFL algorithm called scalable BFL (SBFL). In SBFL, we assume a simplified distribution for each local gradient. Each mobile device sends its one-bit quantized local gradient together with two scalar parameters that represent this distribution. The server then aggregates the noisy, faded quantized gradients so as to minimize the MSE. We provide a convergence analysis of SBFL for a class of non-convex loss functions, which elucidates how the parameters of the communication channels and the gradient priors affect convergence. Simulations demonstrate that SBFL considerably outperforms the conventional sign stochastic gradient descent (signSGD) algorithm when training and testing neural networks on the MNIST data set over heterogeneous wireless networks.
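To make the aggregation idea concrete, below is a minimal Python sketch of an MMSE-style Bayesian estimate of a single one-bit quantized gradient coordinate at the server. This is not the authors' implementation: the Gaussian prior N(mu, sigma^2) (standing in for the two scalars each device reports), the sign quantizer, the real-valued BPSK channel y = h * sign(g) + noise with known gain h and noise variance n0, and all function and parameter names are assumptions made for illustration.

```python
import numpy as np
from scipy.stats import norm

def mmse_gradient_estimate(y, h, n0, mu, sigma):
    """Posterior-mean (MMSE) estimate of one gradient coordinate g.

    Assumed model (illustrative only):
      prior    g ~ N(mu, sigma^2)           -- the two scalars a device reports
      sent     s = sign(g)                  -- one-bit quantized gradient
      channel  y = h * s + w, w ~ N(0, n0)  -- faded, noisy received symbol
    """
    # Prior probability that the transmitted sign is +1, i.e., P(g > 0).
    p_plus = norm.cdf(mu / sigma)
    # Channel likelihood of the received symbol under each sign hypothesis.
    lik_plus = norm.pdf(y, loc=+h, scale=np.sqrt(n0))
    lik_minus = norm.pdf(y, loc=-h, scale=np.sqrt(n0))
    # Posterior over the transmitted sign via Bayes' rule.
    num = p_plus * lik_plus
    post_plus = num / (num + (1.0 - p_plus) * lik_minus)
    # Means of the Gaussian prior truncated to each half-line
    # (standard truncated-normal identities).
    r = norm.pdf(mu / sigma)
    e_plus = mu + sigma * r / norm.cdf(+mu / sigma)   # E[g | g > 0]
    e_minus = mu - sigma * r / norm.cdf(-mu / sigma)  # E[g | g < 0]
    # MMSE estimate: E[g | y] = P(+|y) E[g | +] + P(-|y) E[g | -].
    return post_plus * e_plus + (1.0 - post_plus) * e_minus

# Toy usage: one device, one coordinate, hypothetical channel values.
rng = np.random.default_rng(0)
g_true = rng.normal(0.3, 1.0)                        # local gradient coordinate
y_rx = 0.8 * np.sign(g_true) + rng.normal(0.0, 0.5)  # h = 0.8, n0 = 0.25
print(mmse_gradient_estimate(y_rx, h=0.8, n0=0.25, mu=0.3, sigma=1.0))
```

In an SBFL-like setting, the server would apply such an estimate per coordinate and per device and then average across devices; the per-device scalars (mu, sigma) here stand in for the two parameters of the simplified gradient distribution that the paper has each device transmit.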

