Channel-Driven Monte Carlo Sampling for Bayesian Distributed Learning in Wireless Data Centers

03/01/2021
by   Dongzhu Liu, et al.

Conventional frequentist learning, as assumed by existing federated learning protocols, is limited in its ability to quantify uncertainty, incorporate prior knowledge, guide active learning, and enable continual learning. Bayesian learning provides a principled approach to address all these limitations, at the cost of an increase in computational complexity. This paper studies distributed Bayesian learning in a wireless data center setting encompassing a central server and multiple distributed workers. Prior work on wireless distributed learning has focused exclusively on frequentist learning, and has introduced the idea of leveraging uncoded transmission to enable "over-the-air" computing. Unlike frequentist learning, Bayesian learning aims at evaluating approximations or samples from a global posterior distribution in the model parameter space. This work investigates for the first time the design of distributed one-shot, or "embarrassingly parallel", Bayesian learning protocols in wireless data centers via consensus Monte Carlo (CMC). Uncoded transmission is introduced not only as a way to implement "over-the-air" computing, but also as a mechanism to deploy channel-driven MC sampling: Rather than treating channel noise as a nuisance to be mitigated, channel-driven sampling utilizes channel noise as an integral part of the MC sampling process. A simple wireless CMC scheme is first proposed that is asymptotically optimal under Gaussian local posteriors. Then, for arbitrary local posteriors, a variational optimization strategy is introduced. Simulation results demonstrate that, if properly accounted for, channel noise can indeed contribute to MC sampling and does not necessarily decrease the accuracy level.
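As a minimal numerical sketch of the ideas above, the snippet below simulates one-dimensional Gaussian local posteriors, for which the precision-weighted average of local samples is an exact draw from the global posterior (the regime in which the paper's first scheme is asymptotically optimal). It also sketches the channel-driven intuition: if the receiver scaling is chosen so that the rescaled channel noise has exactly the global posterior variance, the additive noise itself supplies the Monte Carlo randomness. All variable names, the noise power `N0`, and the scaling `alpha` are illustrative assumptions, not the paper's notation or actual protocol.

```python
import numpy as np

rng = np.random.default_rng(0)

# Local Gaussian posteriors at K=3 workers: N(mu_k, sigma_k^2).
mu = np.array([1.0, 2.0, 3.0])
sigma2 = np.array([0.5, 1.0, 2.0])
lam = 1.0 / sigma2                        # local precisions

# Global posterior = normalized product of the local posteriors.
lam_star = lam.sum()
mu_star = (lam * mu).sum() / lam_star
sigma_star = np.sqrt(1.0 / lam_star)

n_samples = 200_000

# --- Standard consensus Monte Carlo: each worker draws a local
#     sample; the server forms a precision-weighted average,
#     which is exact for Gaussian local posteriors. ---
theta_k = rng.normal(mu, np.sqrt(sigma2), size=(n_samples, 3))
w = lam / lam_star
theta_cmc = theta_k @ w                   # one global sample per row

# --- Channel-driven variant (idealized): workers transmit the
#     deterministic weighted means uncoded; over-the-air summation
#     adds Gaussian channel noise, and the receiver scaling alpha
#     makes the rescaled noise variance equal sigma_star^2, so the
#     channel noise plays the role of the sampling randomness. ---
N0 = 0.1                                  # assumed channel noise power
alpha = np.sqrt(N0) / sigma_star          # assumed power scaling
rx = alpha * (w @ mu) + rng.normal(0.0, np.sqrt(N0), n_samples)
theta_air = rx / alpha                    # ~ N(mu_star, sigma_star^2)

print(f"target   : mean={mu_star:.3f}, std={sigma_star:.3f}")
print(f"CMC      : mean={theta_cmc.mean():.3f}, std={theta_cmc.std():.3f}")
print(f"over-air : mean={theta_air.mean():.3f}, std={theta_air.std():.3f}")
```

In this idealized setting both aggregates match the global posterior's mean and standard deviation; the paper's contribution lies in handling the realistic cases (fixed channel noise power, non-Gaussian local posteriors) that this toy example assumes away.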

Related research

- 05/07/2023 — Bayesian Over-the-Air FedAvg via Channel Driven Stochastic Gradient Langevin Dynamics. The recent development of scalable Bayesian inference methods has renewe...
- 12/17/2021 — Coded Consensus Monte Carlo: Robust One-Shot Distributed Bayesian Learning with Stragglers. This letter studies distributed Bayesian learning in a setting encompass...
- 08/17/2021 — Wireless Federated Langevin Monte Carlo: Repurposing Channel Noise for Bayesian Sampling and Privacy. Most works on federated learning (FL) focus on the most common frequenti...
- 02/28/2022 — Leveraging Channel Noise for Sampling and Privacy via Quantized Federated Langevin Monte Carlo. For engineering applications of artificial intelligence, Bayesian learni...
- 06/27/2012 — Monte Carlo Bayesian Reinforcement Learning. Bayesian reinforcement learning (BRL) encodes prior knowledge of the wor...
- 05/27/2021 — Nested sampling for frequentist computation: fast estimation of small p-values. We propose a novel method for computing p-values based on nested samplin...
- 02/15/2019 — Monte Carlo Sampling Bias in the Microwave Uncertainty Framework. Uncertainty propagation software can have unknown, inadvertent biases in...
