
The Matrix Generalized Inverse Gaussian Distribution: Properties and Applications
While the Matrix Generalized Inverse Gaussian (MGIG) distribution arises naturally in some settings as a distribution over symmetric positive semidefinite matrices, certain key properties of the distribution and effective ways of sampling from it have not been carefully studied. In this paper, we show that the MGIG is unimodal, and that the mode can be obtained by solving an Algebraic Riccati Equation (ARE) [7]. Based on this property, we propose an importance sampling method for the MGIG in which the mode of the proposal distribution matches that of the target. The proposed sampling method is more efficient than existing approaches [32, 33], which use proposal distributions whose mode may be far from the MGIG's mode. Further, we show that the posterior distribution in latent factor models, such as probabilistic matrix factorization (PMF) [25], has the MGIG distribution when marginalized over one latent factor. This characterization leads to a novel Collapsed Monte Carlo (CMC) inference algorithm for such latent factor models. We illustrate that CMC achieves lower log loss or perplexity than MCMC and needs fewer samples.
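The core idea of the paper's sampler — match the proposal's mode to the target's mode, then correct with importance weights — can be illustrated in the scalar case. The sketch below is not the paper's matrix algorithm: it uses a univariate GIG-style unnormalized density as a stand-in for the MGIG (in one dimension the mode solves a quadratic, the scalar analogue of the ARE step), and a gamma proposal whose mode is set to the target's mode. All parameter values are illustrative assumptions.

```python
import numpy as np
from scipy import stats

# Toy stand-in for the MGIG: an unnormalized GIG(p, a, b) log-density
# on (0, inf), proportional to x^(p-1) * exp(-(a*x + b/x)/2).
p, a, b = 2.0, 1.0, 1.0

def log_target(x):
    return (p - 1) * np.log(x) - 0.5 * (a * x + b / x)

# Mode of the GIG: setting the derivative of the log-density to zero
# gives the quadratic a*x^2 - 2*(p-1)*x - b = 0 (scalar analogue of
# solving the ARE for the MGIG mode). Take the positive root.
mode = ((p - 1) + np.sqrt((p - 1) ** 2 + a * b)) / a

# Mode-matched proposal: a gamma distribution whose mode (shape-1)*scale
# equals the target's mode. shape=2 is an assumed tuning choice that also
# keeps the proposal's tail heavier than the target's, so weights stay bounded.
shape = 2.0
scale = mode / (shape - 1)

rng = np.random.default_rng(0)
xs = rng.gamma(shape, scale, size=50_000)

# Self-normalized importance weights, computed in log space for stability.
log_w = log_target(xs) - stats.gamma.logpdf(xs, shape, scale=scale)
w = np.exp(log_w - log_w.max())
w /= w.sum()

est_mean = np.sum(w * xs)        # importance-sampling estimate of E[X]
ess = 1.0 / np.sum(w ** 2)       # effective sample size
```

Because the proposal peaks where the target does, the weights are well behaved and the effective sample size stays a large fraction of the raw sample count; a mis-placed proposal mode would concentrate the weight on a few samples instead. For GIG(2, 1, 1) the true mean is sqrt(b/a) * K_3(1)/K_2(1), roughly 4.37, which the estimate should approach.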