Bayesian Functional Principal Components Analysis via Variational Message Passing

04/01/2021
by Tui H. Nolan, et al.

Functional principal components analysis is a popular tool for inference on functional data. Standard approaches rely on an eigendecomposition of a smoothed covariance surface in order to extract the orthonormal functions representing the major modes of variation. This can be a computationally intensive procedure, especially for large datasets with irregular observations. In this article, we develop a Bayesian approach, which aims to determine the Karhunen-Loève decomposition directly without the need to smooth and estimate a covariance surface. More specifically, we develop a variational Bayesian algorithm via message passing over a factor graph, more commonly referred to as variational message passing. Message passing algorithms are a powerful tool for compartmentalizing the algebra and coding required for inference in hierarchical statistical models. Recently, there has been much focus on formulating variational inference algorithms in the message passing framework because it removes the need to rederive approximate posterior density functions when the model changes. Instead, model changes are handled by changing specific computational units, known as fragments, within the factor graph. We extend the notion of variational message passing to functional principal components analysis. Indeed, this is the first article to address a functional data model via variational message passing. Our approach introduces two new fragments that are necessary for Bayesian functional principal components analysis. We present the computational details, a set of simulations for assessing accuracy and speed, and an application to United States temperature data.
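For context, the standard approach the abstract contrasts with (eigendecomposition of an estimated covariance surface) can be sketched in a few lines. The sketch below is an illustrative assumption, not the paper's method: it uses simulated curves observed densely on a regular grid, takes the raw empirical covariance, and omits the smoothing step entirely.

```python
import numpy as np

# Minimal sketch of classical FPCA via eigendecomposition of an
# empirical covariance surface (NOT the paper's Bayesian/VMP approach).
# Assumes dense, regularly observed curves; smoothing is omitted.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 100)            # common observation grid
n = 50                                # number of curves

# Simulate curves from a two-component Karhunen-Loeve expansion
# with orthonormal eigenfunctions and eigenvalues 4 and 1.
phi1 = np.sqrt(2) * np.sin(2 * np.pi * t)
phi2 = np.sqrt(2) * np.cos(2 * np.pi * t)
scores = rng.normal(size=(n, 2)) * np.sqrt([4.0, 1.0])
X = scores[:, :1] * phi1 + scores[:, 1:] * phi2 \
    + 0.1 * rng.normal(size=(n, t.size))

# Empirical covariance surface and its eigendecomposition
Xc = X - X.mean(axis=0)
K = Xc.T @ Xc / n                     # 100 x 100 covariance surface
evals, evecs = np.linalg.eigh(K)
order = np.argsort(evals)[::-1]       # sort eigenpairs, largest first
evals, evecs = evals[order], evecs[:, order]

# Rescale discrete eigenvectors to eigenfunctions that are
# orthonormal with respect to integration over [0, 1].
dt = t[1] - t[0]
eigenfunctions = evecs / np.sqrt(dt)
kl_eigenvalues = evals * dt           # approximate KL eigenvalues
```

The leading columns of `eigenfunctions` recover the major modes of variation, and `kl_eigenvalues[:2]` should sit near the true values 4 and 1 up to sampling error. The eigendecomposition of the full grid-by-grid covariance is what becomes expensive for large, irregularly observed datasets, which is the motivation for bypassing the covariance surface altogether.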

