Multivariate Gaussian Variational Inference by Natural Gradient Descent

01/27/2020
by Timothy D. Barfoot, et al.

This short note reviews so-called Natural Gradient Descent (NGD) for multivariate Gaussians. The Fisher Information Matrix (FIM) is derived for several different parameterizations of Gaussians. Careful attention is paid to the symmetric nature of the covariance matrix when calculating derivatives. We show that there are some advantages to choosing a parameterization comprising the mean and inverse covariance matrix and provide a simple NGD update that accounts for the symmetric (and sparse) nature of the inverse covariance matrix.
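The key property behind NGD for exponential families such as the Gaussian is that the natural gradient with respect to the natural parameters (here the information vector Σ⁻¹μ and the inverse covariance Σ⁻¹) equals the ordinary gradient with respect to the mean parameters. The sketch below illustrates this for the simplest possible objective, KL(q ‖ p) between two Gaussians, where the natural-parameter update has a closed form and a unit step size converges in a single iteration. This is an illustrative derivation under those assumptions, not the note's own update rule; the helper name `ngd_step` and the specific target values are made up for the example.

```python
import numpy as np

def ngd_step(mu, Sinv, mu_p, Sinv_p, beta):
    """One natural-gradient step on KL(q || p) for multivariate Gaussians.

    q = N(mu, Sinv^{-1}) is parameterized by its mean and inverse covariance.
    For exponential families the natural gradient w.r.t. the natural
    parameters equals the ordinary gradient w.r.t. the mean parameters,
    which for this objective reduces (assumed derivation, for illustration
    only) to convex combinations in natural-parameter space:

        Sinv_new        = (1 - beta) * Sinv        + beta * Sinv_p
        (Sinv mu)_new   = (1 - beta) * (Sinv @ mu) + beta * (Sinv_p @ mu_p)
    """
    Sinv_new = (1.0 - beta) * Sinv + beta * Sinv_p
    eta_new = (1.0 - beta) * (Sinv @ mu) + beta * (Sinv_p @ mu_p)
    # Recover the mean without explicitly inverting the (sparse, in general)
    # inverse covariance.
    mu_new = np.linalg.solve(Sinv_new, eta_new)
    # Symmetrize to guard against round-off breaking the symmetry of Sinv.
    Sinv_new = 0.5 * (Sinv_new + Sinv_new.T)
    return mu_new, Sinv_new

# Hypothetical target p and initial q for the demo.
mu_p = np.array([1.0, -2.0])
Sinv_p = np.linalg.inv(np.array([[2.0, 0.5], [0.5, 1.0]]))
mu = np.zeros(2)
Sinv = np.eye(2)

# With a unit step size, NGD in natural parameters converges in one step.
mu, Sinv = ngd_step(mu, Sinv, mu_p, Sinv_p, beta=1.0)
print(np.allclose(mu, mu_p), np.allclose(Sinv, Sinv_p))  # True True
```

With `beta < 1` the iterates interpolate smoothly between the initial and target natural parameters, which is the usual damped form of the update in practice. Note that the update acts directly on Σ⁻¹, so any sparsity in the inverse covariance is preserved whenever the gradient term shares that sparsity pattern.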


