Bayesian Over-the-Air FedAvg via Channel Driven Stochastic Gradient Langevin Dynamics

05/07/2023
by Boning Zhang, et al.

The recent development of scalable Bayesian inference methods has renewed interest in Bayesian learning as an alternative to conventional frequentist learning, offering improved model calibration through uncertainty quantification. Recently, federated averaging Langevin dynamics (FALD) was introduced as a variant of federated averaging that can efficiently implement distributed Bayesian learning over noiseless communication channels. In this paper, we propose wireless FALD (WFALD), a novel protocol that realizes FALD in wireless systems by integrating over-the-air computation and channel-driven sampling for Monte Carlo updates. Unlike prior work on wireless Bayesian learning, WFALD enables (i) multiple local updates between communication rounds; and (ii) stochastic gradients computed on mini-batches. A convergence analysis is presented in terms of the 2-Wasserstein distance between the samples produced by WFALD and the targeted global posterior distribution. Analysis and experiments show that, when the signal-to-noise ratio is sufficiently large, channel noise can be fully repurposed for Monte Carlo sampling, thus entailing no loss in performance.
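The key idea of repurposing channel noise for Langevin sampling can be illustrated with a minimal toy simulation. The sketch below is hypothetical and not the paper's actual WFALD protocol: it uses a 1-D Gaussian model with illustrative constants, a single local step, and a noiseless uplink except for additive Gaussian channel noise. Each client's potential is f_k(theta) = theta^2 / 2, so the global posterior is N(0, 1/K); the over-the-air channel sums the client gradients and adds noise, which the server counts toward the Langevin injection budget of variance 2*lr.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 5            # number of clients (illustrative)
lr = 1e-3        # Langevin step size
sigma_ch = 1.0   # std of the additive channel noise
rounds = 20000

theta = 0.0
samples = []
for _ in range(rounds):
    grads = np.full(K, theta)                            # grad f_k(theta) = theta
    rx = grads.sum() + sigma_ch * rng.standard_normal()  # over-the-air sum + channel noise
    # The channel noise, scaled by lr, contributes variance (lr * sigma_ch)^2
    # to the update; the server injects only the residual noise needed to
    # reach the Langevin target 2 * lr. At sufficiently high SNR the channel
    # noise is fully absorbed into the sampling noise.
    extra_var = max(0.0, 2 * lr - (lr * sigma_ch) ** 2)
    theta = theta - lr * rx + np.sqrt(extra_var) * rng.standard_normal()
    samples.append(theta)

# After burn-in, the empirical variance approaches the posterior variance 1/K.
post_var = np.var(samples[rounds // 2:])
```

The total injected variance per round is exactly (lr * sigma_ch)^2 + extra_var = 2 * lr, matching a standard Langevin update, which is why the channel noise entails no performance loss in this regime.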


Related research

02/28/2022
Leveraging Channel Noise for Sampling and Privacy via Quantized Federated Langevin Monte Carlo
For engineering applications of artificial intelligence, Bayesian learni...

03/01/2021
Channel-Driven Monte Carlo Sampling for Bayesian Distributed Learning in Wireless Data Centers
Conventional frequentist learning, as assumed by existing federated lear...

08/17/2021
Wireless Federated Langevin Monte Carlo: Repurposing Channel Noise for Bayesian Sampling and Privacy
Most works on federated learning (FL) focus on the most common frequenti...

12/09/2021
On Convergence of Federated Averaging Langevin Dynamics
We propose a federated averaging Langevin algorithm (FA-LD) for uncertai...

12/04/2018
Parallel-tempered Stochastic Gradient Hamiltonian Monte Carlo for Approximate Multimodal Posterior Sampling
We propose a new sampler that integrates the protocol of parallel temper...

10/17/2022
Data Subsampling for Bayesian Neural Networks
Markov Chain Monte Carlo (MCMC) algorithms do not scale well for large d...

05/21/2021
Removing the mini-batching error in Bayesian inference using Adaptive Langevin dynamics
The computational cost of usual Monte Carlo methods for sampling a poste...
