Federated Learning as Variational Inference: A Scalable Expectation Propagation Approach

02/08/2023
by Han Guo, et al.

The canonical formulation of federated learning treats it as a distributed optimization problem where the model parameters are optimized against a global loss function that decomposes across client loss functions. A recent alternative formulation instead treats federated learning as a distributed inference problem, where the goal is to infer a global posterior from partitioned client data (Al-Shedivat et al., 2021). This paper extends the inference view and describes a variational inference formulation of federated learning where the goal is to find a global variational posterior that well-approximates the true posterior. This naturally motivates an expectation propagation approach to federated learning (FedEP), where approximations to the global posterior are iteratively refined through probabilistic message-passing between the central server and the clients. We conduct an extensive empirical study across various algorithmic considerations and describe practical strategies for scaling up expectation propagation to the modern federated setting. We apply FedEP on standard federated learning benchmarks and find that it outperforms strong baselines in terms of both convergence speed and accuracy.
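The message-passing loop the abstract describes can be illustrated with a toy example. The sketch below is not the paper's FedEP implementation; it is a minimal pure-Python illustration under simplifying assumptions: a one-dimensional Gaussian model with known noise variance, so each client's approximate factor is a Gaussian kept in natural parameters (precision and precision-mean), and the "tilted" update is exact. Each round, a client subtracts its own factor from the global posterior to form the cavity, incorporates its local data, and sends the refreshed factor back to the server, which re-aggregates. The names `fedep`, `noise_var`, and `prior_var` are illustrative, not from the paper.

```python
import random

def fedep(client_data, noise_var=1.0, prior_var=100.0, rounds=5):
    """Toy expectation-propagation loop for federated posterior inference.

    Each client k holds a Gaussian factor in natural parameters:
    precision lam[k] and precision-mean eta[k]. The global approximate
    posterior is the prior times the product of all client factors,
    i.e. the sum of natural parameters.
    """
    K = len(client_data)
    lam = [0.0] * K                      # per-client factor precisions
    eta = [0.0] * K                      # per-client factor precision-means
    prior_lam, prior_eta = 1.0 / prior_var, 0.0
    for _ in range(rounds):
        for k, x in enumerate(client_data):
            g_lam = prior_lam + sum(lam)  # server: global precision
            g_eta = prior_eta + sum(eta)  # server: global precision-mean
            # Client k forms the cavity by removing its own factor.
            cav_lam = g_lam - lam[k]
            cav_eta = g_eta - eta[k]
            # Tilted posterior = cavity * local likelihood. With a Gaussian
            # likelihood and known noise variance this is exact.
            tilt_lam = cav_lam + len(x) / noise_var
            tilt_eta = cav_eta + sum(x) / noise_var
            # New client factor = tilted / cavity (in natural parameters).
            lam[k] = tilt_lam - cav_lam
            eta[k] = tilt_eta - cav_eta
    g_lam = prior_lam + sum(lam)
    g_eta = prior_eta + sum(eta)
    return g_eta / g_lam, 1.0 / g_lam     # posterior mean and variance

random.seed(0)
# Four clients, each with 50 observations from N(2, 1).
data = [[random.gauss(2.0, 1.0) for _ in range(50)] for _ in range(4)]
mean, var = fedep(data)
```

Because every factor is Gaussian and the local updates are exact here, the loop recovers the exact global posterior; in the realistic neural-network setting the tilted step is instead approximated (e.g. by local optimization or sampling), which is where the algorithmic choices the paper studies come in.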


Related research

10/11/2020
Federated Learning via Posterior Averaging: A New Perspective and Practical Algorithms
Federated learning is typically approached as an optimization problem, w...

11/19/2021
An Expectation-Maximization Perspective on Federated Learning
Federated learning describes the distributed training of models across m...

06/15/2022
Bayesian Federated Learning via Predictive Distribution Distillation
For most existing federated learning algorithms, each round consists of ...

05/23/2023
Federated Variational Inference: Towards Improved Personalization and Generalization
Conventional federated learning algorithms train a single global model b...

09/11/2020
Federated Generalized Bayesian Learning via Distributed Stein Variational Gradient Descent
This paper introduces Distributed Stein Variational Gradient Descent (DS...

11/27/2018
Partitioned Variational Inference: A unified framework encompassing federated and continual learning
Variational inference (VI) has become the method of choice for fitting m...

02/24/2022
Partitioned Variational Inference: A Framework for Probabilistic Federated Learning
The proliferation of computing devices has brought about an opportunity ...
