Partitioned Variational Inference: A unified framework encompassing federated and continual learning

11/27/2018
by Thang D. Bui, et al.

Variational inference (VI) has become the method of choice for fitting many modern probabilistic models. However, practitioners are faced with a fragmented literature that offers a bewildering array of algorithmic options. First, the variational family. Second, the granularity of the updates, e.g. whether the updates are local to each data point and employ message passing, or global. Third, the method of optimization (bespoke or black-box, closed-form or stochastic updates, etc.). This paper presents a new framework, termed Partitioned Variational Inference (PVI), that explicitly acknowledges these algorithmic dimensions of VI, unifies disparate literature, and provides guidance on usage. Crucially, the proposed PVI framework allows us to identify new ways of performing VI that are ideally suited to challenging learning scenarios, including federated learning (where distributed computing is leveraged to process non-centralized data) and continual learning (where new data and tasks arrive over time and must be accommodated quickly). We showcase these new capabilities by developing communication-efficient federated training of Bayesian neural networks and continual learning for Gaussian process models with private pseudo-points. The new methods significantly outperform the state of the art, whilst being almost as straightforward to implement as standard VI.
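To make the "granularity of the updates" concrete, here is a minimal sketch of the partitioned idea in the simplest possible setting: Bayesian linear regression with a Gaussian prior and likelihood, where every local refinement has a closed form. The partition/server structure, variable names, and single synchronous pass are illustrative assumptions, not the paper's reference implementation; in the paper's actual experiments (Bayesian neural networks, GPs) the local step is a stochastic free-energy optimisation rather than a closed-form update.

```python
import numpy as np

# Illustrative PVI sketch (assumed setup, not the paper's code):
# the approximate posterior q(theta) is the prior times a product of
# per-partition approximate likelihood factors t_m, and each partition
# refines only its own factor.

rng = np.random.default_rng(0)
D, sigma2, M = 3, 0.1, 4                  # dims, noise variance, partitions
theta_true = rng.normal(size=D)

# Each partition ("client" in the federated reading) holds private data.
parts = []
for _ in range(M):
    X = rng.normal(size=(50, D))
    y = X @ theta_true + np.sqrt(sigma2) * rng.normal(size=50)
    parts.append((X, y))

# Work in natural parameters: precision Lam and precision-times-mean eta.
# Server state: q(theta) = prior x product of the M factors t_m.
server_lam, server_eta = np.eye(D), np.zeros(D)      # p(theta) = N(0, I)
factors = [(np.zeros((D, D)), np.zeros(D)) for _ in range(M)]

for m, (X, y) in enumerate(parts):
    # Local step: refine t_m against the cavity q / t_m. In this conjugate
    # model the optimal t_m is the exact local likelihood contribution; in
    # non-conjugate models it comes from a local free-energy optimisation.
    new_lam = X.T @ X / sigma2
    new_eta = X.T @ y / sigma2
    # Communication: only the *change* in t_m's natural parameters is sent.
    server_lam = server_lam + (new_lam - factors[m][0])
    server_eta = server_eta + (new_eta - factors[m][1])
    factors[m] = (new_lam, new_eta)

posterior_mean = np.linalg.solve(server_lam, server_eta)
print("PVI posterior mean:", posterior_mean)
print("true parameters:   ", theta_true)
```

Because this toy model is conjugate, a single pass recovers the exact posterior; the point of the sketch is the bookkeeping. Each partition owns exactly one approximate-likelihood factor, and only natural-parameter deltas cross the network, which is the property the communication-efficient federated training described in the abstract exploits.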


Related research

02/24/2022 · Partitioned Variational Inference: A Framework for Probabilistic Federated Learning
The proliferation of computing devices has brought about an opportunity ...

12/04/2019 · Indian Buffet Neural Networks for Continual Learning
We place an Indian Buffet Process (IBP) prior over the neural structure ...

11/24/2020 · Generalized Variational Continual Learning
Continual learning deals with training models on new tasks and datasets ...

02/08/2023 · Federated Learning as Variational Inference: A Scalable Expectation Propagation Approach
The canonical formulation of federated learning treats it as a distribut...

06/09/2020 · Variational Auto-Regressive Gaussian Processes for Continual Learning
This paper proposes Variational Auto-Regressive Gaussian Process (VAR-GP...

02/07/2023 · Federated Variational Inference Methods for Structured Latent Variable Models
Federated learning methods, that is, methods that perform model training...

01/31/2019 · Functional Regularisation for Continual Learning using Gaussian Processes
We introduce a novel approach for supervised continual learning based on...
