Partitioned Variational Inference: A Framework for Probabilistic Federated Learning

02/24/2022
by Matthew Ashman, et al.

The proliferation of computing devices has brought about an opportunity to deploy machine learning models on new problem domains using previously inaccessible data. Traditional algorithms for training such models often require data to be stored on a single machine, with compute performed by a single node, making them unsuitable for decentralised training on multiple devices. This deficiency has motivated the development of federated learning algorithms, which allow multiple data owners to collaboratively train and use a shared model whilst keeping local data private. However, many of these algorithms focus on obtaining point estimates of model parameters, rather than probabilistic estimates capable of capturing model uncertainty, which is essential in many applications. Variational inference (VI) has become the method of choice for fitting many modern probabilistic models. In this paper we introduce partitioned variational inference (PVI), a general framework for performing VI in the federated setting. We develop new supporting theory for PVI, demonstrating a number of properties that make it an attractive choice for practitioners; use PVI to unify a wealth of fragmented yet related literature; and provide empirical results that showcase the effectiveness of PVI in a variety of federated settings.
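To make the high-level idea concrete, the sketch below illustrates the kind of factor bookkeeping a PVI-style scheme performs: the global approximate posterior is maintained as the prior multiplied by a product of per-client approximate likelihood factors, and each client refines only its own factor using its local data before the server folds the change back in. The toy conjugate Gaussian model and the names local_vi and run_pvi are illustrative assumptions for this sketch, not code or notation taken from the paper.

import numpy as np

# Toy setting: a Gaussian prior over a scalar mean theta, with each client
# holding observations y ~ N(theta, noise_var). Gaussians are stored as
# natural parameters (precision, precision * mean) so factors multiply by
# simple addition.

def to_natural(mean, var):
    prec = 1.0 / var
    return np.array([prec, prec * mean])

def to_mean_var(natural):
    prec, prec_mean = natural
    return prec_mean / prec, 1.0 / prec

def local_vi(cavity_natural, y, noise_var):
    # For this conjugate model the local "variational" step is exact:
    # multiply the cavity distribution by the Gaussian likelihood of the
    # client's data. In general this would be an iterative local VI fit.
    lik_natural = np.array([len(y) / noise_var, np.sum(y) / noise_var])
    return cavity_natural + lik_natural

def run_pvi(client_data, prior_natural, noise_var, num_rounds=3):
    num_clients = len(client_data)
    # Each client starts with a uniform approximate likelihood factor t_m = 1.
    factors = [np.zeros(2) for _ in range(num_clients)]
    q_natural = prior_natural + sum(factors)
    for _ in range(num_rounds):
        for m, y in enumerate(client_data):
            cavity = q_natural - factors[m]           # remove client m's factor
            q_local = local_vi(cavity, y, noise_var)  # refine using local data only
            factors[m] = q_local - cavity             # updated approximate likelihood
            q_natural = cavity + factors[m]           # server applies the change
    return to_mean_var(q_natural)

rng = np.random.default_rng(0)
clients = [rng.normal(1.5, 1.0, size=20) for _ in range(4)]
print(run_pvi(clients, to_natural(0.0, 10.0), noise_var=1.0))

In this conjugate example the updates are exact, so PVI recovers the true posterior; with non-conjugate models the same bookkeeping applies, but each local step is an approximate variational optimisation.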


